Dec 01 09:09:00 localhost kernel: Linux version 5.14.0-642.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025
Dec 01 09:09:00 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Dec 01 09:09:00 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 01 09:09:00 localhost kernel: BIOS-provided physical RAM map:
Dec 01 09:09:00 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Dec 01 09:09:00 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Dec 01 09:09:00 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Dec 01 09:09:00 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Dec 01 09:09:00 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Dec 01 09:09:00 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec 01 09:09:00 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Dec 01 09:09:00 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Dec 01 09:09:00 localhost kernel: NX (Execute Disable) protection: active
Dec 01 09:09:00 localhost kernel: APIC: Static calls initialized
Dec 01 09:09:00 localhost kernel: SMBIOS 2.8 present.
Dec 01 09:09:00 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Dec 01 09:09:00 localhost kernel: Hypervisor detected: KVM
Dec 01 09:09:00 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec 01 09:09:00 localhost kernel: kvm-clock: using sched offset of 5344503434 cycles
Dec 01 09:09:00 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec 01 09:09:00 localhost kernel: tsc: Detected 2799.998 MHz processor
Dec 01 09:09:00 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Dec 01 09:09:00 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Dec 01 09:09:00 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Dec 01 09:09:00 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Dec 01 09:09:00 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Dec 01 09:09:00 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Dec 01 09:09:00 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Dec 01 09:09:00 localhost kernel: Using GB pages for direct mapping
Dec 01 09:09:00 localhost kernel: RAMDISK: [mem 0x2d83a000-0x32c14fff]
Dec 01 09:09:00 localhost kernel: ACPI: Early table checksum verification disabled
Dec 01 09:09:00 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Dec 01 09:09:00 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 01 09:09:00 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 01 09:09:00 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 01 09:09:00 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Dec 01 09:09:00 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 01 09:09:00 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 01 09:09:00 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Dec 01 09:09:00 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Dec 01 09:09:00 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Dec 01 09:09:00 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Dec 01 09:09:00 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Dec 01 09:09:00 localhost kernel: No NUMA configuration found
Dec 01 09:09:00 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Dec 01 09:09:00 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Dec 01 09:09:00 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Dec 01 09:09:00 localhost kernel: Zone ranges:
Dec 01 09:09:00 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Dec 01 09:09:00 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Dec 01 09:09:00 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Dec 01 09:09:00 localhost kernel:   Device   empty
Dec 01 09:09:00 localhost kernel: Movable zone start for each node
Dec 01 09:09:00 localhost kernel: Early memory node ranges
Dec 01 09:09:00 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Dec 01 09:09:00 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Dec 01 09:09:00 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Dec 01 09:09:00 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Dec 01 09:09:00 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec 01 09:09:00 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Dec 01 09:09:00 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Dec 01 09:09:00 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Dec 01 09:09:00 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec 01 09:09:00 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec 01 09:09:00 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec 01 09:09:00 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec 01 09:09:00 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec 01 09:09:00 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec 01 09:09:00 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec 01 09:09:00 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec 01 09:09:00 localhost kernel: TSC deadline timer available
Dec 01 09:09:00 localhost kernel: CPU topo: Max. logical packages:   8
Dec 01 09:09:00 localhost kernel: CPU topo: Max. logical dies:       8
Dec 01 09:09:00 localhost kernel: CPU topo: Max. dies per package:   1
Dec 01 09:09:00 localhost kernel: CPU topo: Max. threads per core:   1
Dec 01 09:09:00 localhost kernel: CPU topo: Num. cores per package:     1
Dec 01 09:09:00 localhost kernel: CPU topo: Num. threads per package:   1
Dec 01 09:09:00 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Dec 01 09:09:00 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Dec 01 09:09:00 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Dec 01 09:09:00 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Dec 01 09:09:00 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Dec 01 09:09:00 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Dec 01 09:09:00 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Dec 01 09:09:00 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Dec 01 09:09:00 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Dec 01 09:09:00 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Dec 01 09:09:00 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Dec 01 09:09:00 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Dec 01 09:09:00 localhost kernel: Booting paravirtualized kernel on KVM
Dec 01 09:09:00 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec 01 09:09:00 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Dec 01 09:09:00 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Dec 01 09:09:00 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Dec 01 09:09:00 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Dec 01 09:09:00 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Dec 01 09:09:00 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 01 09:09:00 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64", will be passed to user space.
Dec 01 09:09:00 localhost kernel: random: crng init done
Dec 01 09:09:00 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec 01 09:09:00 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Dec 01 09:09:00 localhost kernel: Fallback order for Node 0: 0 
Dec 01 09:09:00 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Dec 01 09:09:00 localhost kernel: Policy zone: Normal
Dec 01 09:09:00 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 01 09:09:00 localhost kernel: software IO TLB: area num 8.
Dec 01 09:09:00 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Dec 01 09:09:00 localhost kernel: ftrace: allocating 49313 entries in 193 pages
Dec 01 09:09:00 localhost kernel: ftrace: allocated 193 pages with 3 groups
Dec 01 09:09:00 localhost kernel: Dynamic Preempt: voluntary
Dec 01 09:09:00 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 01 09:09:00 localhost kernel: rcu:         RCU event tracing is enabled.
Dec 01 09:09:00 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Dec 01 09:09:00 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Dec 01 09:09:00 localhost kernel:         Rude variant of Tasks RCU enabled.
Dec 01 09:09:00 localhost kernel:         Tracing variant of Tasks RCU enabled.
Dec 01 09:09:00 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 01 09:09:00 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Dec 01 09:09:00 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 01 09:09:00 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 01 09:09:00 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 01 09:09:00 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Dec 01 09:09:00 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 01 09:09:00 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Dec 01 09:09:00 localhost kernel: Console: colour VGA+ 80x25
Dec 01 09:09:00 localhost kernel: printk: console [ttyS0] enabled
Dec 01 09:09:00 localhost kernel: ACPI: Core revision 20230331
Dec 01 09:09:00 localhost kernel: APIC: Switch to symmetric I/O mode setup
Dec 01 09:09:00 localhost kernel: x2apic enabled
Dec 01 09:09:00 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Dec 01 09:09:00 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Dec 01 09:09:00 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Dec 01 09:09:00 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec 01 09:09:00 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Dec 01 09:09:00 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Dec 01 09:09:00 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec 01 09:09:00 localhost kernel: Spectre V2 : Mitigation: Retpolines
Dec 01 09:09:00 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Dec 01 09:09:00 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Dec 01 09:09:00 localhost kernel: RETBleed: Mitigation: untrained return thunk
Dec 01 09:09:00 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec 01 09:09:00 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec 01 09:09:00 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Dec 01 09:09:00 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Dec 01 09:09:00 localhost kernel: x86/bugs: return thunk changed
Dec 01 09:09:00 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Dec 01 09:09:00 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec 01 09:09:00 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec 01 09:09:00 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec 01 09:09:00 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Dec 01 09:09:00 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Dec 01 09:09:00 localhost kernel: Freeing SMP alternatives memory: 40K
Dec 01 09:09:00 localhost kernel: pid_max: default: 32768 minimum: 301
Dec 01 09:09:00 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Dec 01 09:09:00 localhost kernel: landlock: Up and running.
Dec 01 09:09:00 localhost kernel: Yama: becoming mindful.
Dec 01 09:09:00 localhost kernel: SELinux:  Initializing.
Dec 01 09:09:00 localhost kernel: LSM support for eBPF active
Dec 01 09:09:00 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 01 09:09:00 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 01 09:09:00 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Dec 01 09:09:00 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Dec 01 09:09:00 localhost kernel: ... version:                0
Dec 01 09:09:00 localhost kernel: ... bit width:              48
Dec 01 09:09:00 localhost kernel: ... generic registers:      6
Dec 01 09:09:00 localhost kernel: ... value mask:             0000ffffffffffff
Dec 01 09:09:00 localhost kernel: ... max period:             00007fffffffffff
Dec 01 09:09:00 localhost kernel: ... fixed-purpose events:   0
Dec 01 09:09:00 localhost kernel: ... event mask:             000000000000003f
Dec 01 09:09:00 localhost kernel: signal: max sigframe size: 1776
Dec 01 09:09:00 localhost kernel: rcu: Hierarchical SRCU implementation.
Dec 01 09:09:00 localhost kernel: rcu:         Max phase no-delay instances is 400.
Dec 01 09:09:00 localhost kernel: smp: Bringing up secondary CPUs ...
Dec 01 09:09:00 localhost kernel: smpboot: x86: Booting SMP configuration:
Dec 01 09:09:00 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Dec 01 09:09:00 localhost kernel: smp: Brought up 1 node, 8 CPUs
Dec 01 09:09:00 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Dec 01 09:09:00 localhost kernel: node 0 deferred pages initialised in 10ms
Dec 01 09:09:00 localhost kernel: Memory: 7765932K/8388068K available (16384K kernel code, 5787K rwdata, 13900K rodata, 4192K init, 7172K bss, 616276K reserved, 0K cma-reserved)
Dec 01 09:09:00 localhost kernel: devtmpfs: initialized
Dec 01 09:09:00 localhost kernel: x86/mm: Memory block size: 128MB
Dec 01 09:09:00 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 01 09:09:00 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Dec 01 09:09:00 localhost kernel: pinctrl core: initialized pinctrl subsystem
Dec 01 09:09:00 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 01 09:09:00 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Dec 01 09:09:00 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec 01 09:09:00 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec 01 09:09:00 localhost kernel: audit: initializing netlink subsys (disabled)
Dec 01 09:09:00 localhost kernel: audit: type=2000 audit(1764580137.786:1): state=initialized audit_enabled=0 res=1
Dec 01 09:09:00 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Dec 01 09:09:00 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 01 09:09:00 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Dec 01 09:09:00 localhost kernel: cpuidle: using governor menu
Dec 01 09:09:00 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 01 09:09:00 localhost kernel: PCI: Using configuration type 1 for base access
Dec 01 09:09:00 localhost kernel: PCI: Using configuration type 1 for extended access
Dec 01 09:09:00 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec 01 09:09:00 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 01 09:09:00 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Dec 01 09:09:00 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 01 09:09:00 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Dec 01 09:09:00 localhost kernel: Demotion targets for Node 0: null
Dec 01 09:09:00 localhost kernel: cryptd: max_cpu_qlen set to 1000
Dec 01 09:09:00 localhost kernel: ACPI: Added _OSI(Module Device)
Dec 01 09:09:00 localhost kernel: ACPI: Added _OSI(Processor Device)
Dec 01 09:09:00 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Dec 01 09:09:00 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 01 09:09:00 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 01 09:09:00 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Dec 01 09:09:00 localhost kernel: ACPI: Interpreter enabled
Dec 01 09:09:00 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Dec 01 09:09:00 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Dec 01 09:09:00 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec 01 09:09:00 localhost kernel: PCI: Using E820 reservations for host bridge windows
Dec 01 09:09:00 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Dec 01 09:09:00 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 01 09:09:00 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Dec 01 09:09:00 localhost kernel: acpiphp: Slot [3] registered
Dec 01 09:09:00 localhost kernel: acpiphp: Slot [4] registered
Dec 01 09:09:00 localhost kernel: acpiphp: Slot [5] registered
Dec 01 09:09:00 localhost kernel: acpiphp: Slot [6] registered
Dec 01 09:09:00 localhost kernel: acpiphp: Slot [7] registered
Dec 01 09:09:00 localhost kernel: acpiphp: Slot [8] registered
Dec 01 09:09:00 localhost kernel: acpiphp: Slot [9] registered
Dec 01 09:09:00 localhost kernel: acpiphp: Slot [10] registered
Dec 01 09:09:00 localhost kernel: acpiphp: Slot [11] registered
Dec 01 09:09:00 localhost kernel: acpiphp: Slot [12] registered
Dec 01 09:09:00 localhost kernel: acpiphp: Slot [13] registered
Dec 01 09:09:00 localhost kernel: acpiphp: Slot [14] registered
Dec 01 09:09:00 localhost kernel: acpiphp: Slot [15] registered
Dec 01 09:09:00 localhost kernel: acpiphp: Slot [16] registered
Dec 01 09:09:00 localhost kernel: acpiphp: Slot [17] registered
Dec 01 09:09:00 localhost kernel: acpiphp: Slot [18] registered
Dec 01 09:09:00 localhost kernel: acpiphp: Slot [19] registered
Dec 01 09:09:00 localhost kernel: acpiphp: Slot [20] registered
Dec 01 09:09:00 localhost kernel: acpiphp: Slot [21] registered
Dec 01 09:09:00 localhost kernel: acpiphp: Slot [22] registered
Dec 01 09:09:00 localhost kernel: acpiphp: Slot [23] registered
Dec 01 09:09:00 localhost kernel: acpiphp: Slot [24] registered
Dec 01 09:09:00 localhost kernel: acpiphp: Slot [25] registered
Dec 01 09:09:00 localhost kernel: acpiphp: Slot [26] registered
Dec 01 09:09:00 localhost kernel: acpiphp: Slot [27] registered
Dec 01 09:09:00 localhost kernel: acpiphp: Slot [28] registered
Dec 01 09:09:00 localhost kernel: acpiphp: Slot [29] registered
Dec 01 09:09:00 localhost kernel: acpiphp: Slot [30] registered
Dec 01 09:09:00 localhost kernel: acpiphp: Slot [31] registered
Dec 01 09:09:00 localhost kernel: PCI host bridge to bus 0000:00
Dec 01 09:09:00 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Dec 01 09:09:00 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Dec 01 09:09:00 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec 01 09:09:00 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec 01 09:09:00 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Dec 01 09:09:00 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 01 09:09:00 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Dec 01 09:09:00 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Dec 01 09:09:00 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Dec 01 09:09:00 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Dec 01 09:09:00 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Dec 01 09:09:00 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Dec 01 09:09:00 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Dec 01 09:09:00 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Dec 01 09:09:00 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Dec 01 09:09:00 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Dec 01 09:09:00 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Dec 01 09:09:00 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Dec 01 09:09:00 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Dec 01 09:09:00 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Dec 01 09:09:00 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Dec 01 09:09:00 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Dec 01 09:09:00 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Dec 01 09:09:00 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Dec 01 09:09:00 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec 01 09:09:00 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec 01 09:09:00 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Dec 01 09:09:00 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Dec 01 09:09:00 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Dec 01 09:09:00 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Dec 01 09:09:00 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Dec 01 09:09:00 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Dec 01 09:09:00 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Dec 01 09:09:00 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Dec 01 09:09:00 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Dec 01 09:09:00 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Dec 01 09:09:00 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Dec 01 09:09:00 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Dec 01 09:09:00 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Dec 01 09:09:00 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Dec 01 09:09:00 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec 01 09:09:00 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec 01 09:09:00 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec 01 09:09:00 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec 01 09:09:00 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Dec 01 09:09:00 localhost kernel: iommu: Default domain type: Translated
Dec 01 09:09:00 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Dec 01 09:09:00 localhost kernel: SCSI subsystem initialized
Dec 01 09:09:00 localhost kernel: ACPI: bus type USB registered
Dec 01 09:09:00 localhost kernel: usbcore: registered new interface driver usbfs
Dec 01 09:09:00 localhost kernel: usbcore: registered new interface driver hub
Dec 01 09:09:00 localhost kernel: usbcore: registered new device driver usb
Dec 01 09:09:00 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Dec 01 09:09:00 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Dec 01 09:09:00 localhost kernel: PTP clock support registered
Dec 01 09:09:00 localhost kernel: EDAC MC: Ver: 3.0.0
Dec 01 09:09:00 localhost kernel: NetLabel: Initializing
Dec 01 09:09:00 localhost kernel: NetLabel:  domain hash size = 128
Dec 01 09:09:00 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Dec 01 09:09:00 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Dec 01 09:09:00 localhost kernel: PCI: Using ACPI for IRQ routing
Dec 01 09:09:00 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Dec 01 09:09:00 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Dec 01 09:09:00 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Dec 01 09:09:00 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Dec 01 09:09:00 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Dec 01 09:09:00 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec 01 09:09:00 localhost kernel: vgaarb: loaded
Dec 01 09:09:00 localhost kernel: clocksource: Switched to clocksource kvm-clock
Dec 01 09:09:00 localhost kernel: VFS: Disk quotas dquot_6.6.0
Dec 01 09:09:00 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec 01 09:09:00 localhost kernel: pnp: PnP ACPI init
Dec 01 09:09:00 localhost kernel: pnp 00:03: [dma 2]
Dec 01 09:09:00 localhost kernel: pnp: PnP ACPI: found 5 devices
Dec 01 09:09:00 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec 01 09:09:00 localhost kernel: NET: Registered PF_INET protocol family
Dec 01 09:09:00 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec 01 09:09:00 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Dec 01 09:09:00 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec 01 09:09:00 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Dec 01 09:09:00 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Dec 01 09:09:00 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Dec 01 09:09:00 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Dec 01 09:09:00 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec 01 09:09:00 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec 01 09:09:00 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec 01 09:09:00 localhost kernel: NET: Registered PF_XDP protocol family
Dec 01 09:09:00 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Dec 01 09:09:00 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Dec 01 09:09:00 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Dec 01 09:09:00 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Dec 01 09:09:00 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Dec 01 09:09:00 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Dec 01 09:09:00 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Dec 01 09:09:00 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Dec 01 09:09:00 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 114861 usecs
Dec 01 09:09:00 localhost kernel: PCI: CLS 0 bytes, default 64
Dec 01 09:09:00 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Dec 01 09:09:00 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Dec 01 09:09:00 localhost kernel: ACPI: bus type thunderbolt registered
Dec 01 09:09:00 localhost kernel: Trying to unpack rootfs image as initramfs...
Dec 01 09:09:00 localhost kernel: Initialise system trusted keyrings
Dec 01 09:09:00 localhost kernel: Key type blacklist registered
Dec 01 09:09:00 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Dec 01 09:09:00 localhost kernel: zbud: loaded
Dec 01 09:09:00 localhost kernel: integrity: Platform Keyring initialized
Dec 01 09:09:00 localhost kernel: integrity: Machine keyring initialized
Dec 01 09:09:00 localhost kernel: Freeing initrd memory: 85868K
Dec 01 09:09:00 localhost kernel: NET: Registered PF_ALG protocol family
Dec 01 09:09:00 localhost kernel: xor: automatically using best checksumming function   avx       
Dec 01 09:09:00 localhost kernel: Key type asymmetric registered
Dec 01 09:09:00 localhost kernel: Asymmetric key parser 'x509' registered
Dec 01 09:09:00 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Dec 01 09:09:00 localhost kernel: io scheduler mq-deadline registered
Dec 01 09:09:00 localhost kernel: io scheduler kyber registered
Dec 01 09:09:00 localhost kernel: io scheduler bfq registered
Dec 01 09:09:00 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Dec 01 09:09:00 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Dec 01 09:09:00 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Dec 01 09:09:00 localhost kernel: ACPI: button: Power Button [PWRF]
Dec 01 09:09:00 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Dec 01 09:09:00 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Dec 01 09:09:00 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Dec 01 09:09:00 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec 01 09:09:00 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Dec 01 09:09:00 localhost kernel: Non-volatile memory driver v1.3
Dec 01 09:09:00 localhost kernel: rdac: device handler registered
Dec 01 09:09:00 localhost kernel: hp_sw: device handler registered
Dec 01 09:09:00 localhost kernel: emc: device handler registered
Dec 01 09:09:00 localhost kernel: alua: device handler registered
Dec 01 09:09:00 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Dec 01 09:09:00 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Dec 01 09:09:00 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Dec 01 09:09:00 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Dec 01 09:09:00 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Dec 01 09:09:00 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Dec 01 09:09:00 localhost kernel: usb usb1: Product: UHCI Host Controller
Dec 01 09:09:00 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-642.el9.x86_64 uhci_hcd
Dec 01 09:09:00 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Dec 01 09:09:00 localhost kernel: hub 1-0:1.0: USB hub found
Dec 01 09:09:00 localhost kernel: hub 1-0:1.0: 2 ports detected
Dec 01 09:09:00 localhost kernel: usbcore: registered new interface driver usbserial_generic
Dec 01 09:09:00 localhost kernel: usbserial: USB Serial support registered for generic
Dec 01 09:09:00 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Dec 01 09:09:00 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Dec 01 09:09:00 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Dec 01 09:09:00 localhost kernel: mousedev: PS/2 mouse device common for all mice
Dec 01 09:09:00 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Dec 01 09:09:00 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Dec 01 09:09:00 localhost kernel: rtc_cmos 00:04: registered as rtc0
Dec 01 09:09:00 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-12-01T09:08:59 UTC (1764580139)
Dec 01 09:09:00 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Dec 01 09:09:00 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Dec 01 09:09:00 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Dec 01 09:09:00 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Dec 01 09:09:00 localhost kernel: usbcore: registered new interface driver usbhid
Dec 01 09:09:00 localhost kernel: usbhid: USB HID core driver
Dec 01 09:09:00 localhost kernel: drop_monitor: Initializing network drop monitor service
Dec 01 09:09:00 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Dec 01 09:09:00 localhost kernel: Initializing XFRM netlink socket
Dec 01 09:09:00 localhost kernel: NET: Registered PF_INET6 protocol family
Dec 01 09:09:00 localhost kernel: Segment Routing with IPv6
Dec 01 09:09:00 localhost kernel: NET: Registered PF_PACKET protocol family
Dec 01 09:09:00 localhost kernel: mpls_gso: MPLS GSO support
Dec 01 09:09:00 localhost kernel: IPI shorthand broadcast: enabled
Dec 01 09:09:00 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Dec 01 09:09:00 localhost kernel: AES CTR mode by8 optimization enabled
Dec 01 09:09:00 localhost kernel: sched_clock: Marking stable (1451001851, 174292915)->(1764981561, -139686795)
Dec 01 09:09:00 localhost kernel: registered taskstats version 1
Dec 01 09:09:00 localhost kernel: Loading compiled-in X.509 certificates
Dec 01 09:09:00 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Dec 01 09:09:00 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Dec 01 09:09:00 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Dec 01 09:09:00 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Dec 01 09:09:00 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Dec 01 09:09:00 localhost kernel: Demotion targets for Node 0: null
Dec 01 09:09:00 localhost kernel: page_owner is disabled
Dec 01 09:09:00 localhost kernel: Key type .fscrypt registered
Dec 01 09:09:00 localhost kernel: Key type fscrypt-provisioning registered
Dec 01 09:09:00 localhost kernel: Key type big_key registered
Dec 01 09:09:00 localhost kernel: Key type encrypted registered
Dec 01 09:09:00 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Dec 01 09:09:00 localhost kernel: Loading compiled-in module X.509 certificates
Dec 01 09:09:00 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Dec 01 09:09:00 localhost kernel: ima: Allocated hash algorithm: sha256
Dec 01 09:09:00 localhost kernel: ima: No architecture policies found
Dec 01 09:09:00 localhost kernel: evm: Initialising EVM extended attributes:
Dec 01 09:09:00 localhost kernel: evm: security.selinux
Dec 01 09:09:00 localhost kernel: evm: security.SMACK64 (disabled)
Dec 01 09:09:00 localhost kernel: evm: security.SMACK64EXEC (disabled)
Dec 01 09:09:00 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Dec 01 09:09:00 localhost kernel: evm: security.SMACK64MMAP (disabled)
Dec 01 09:09:00 localhost kernel: evm: security.apparmor (disabled)
Dec 01 09:09:00 localhost kernel: evm: security.ima
Dec 01 09:09:00 localhost kernel: evm: security.capability
Dec 01 09:09:00 localhost kernel: evm: HMAC attrs: 0x1
Dec 01 09:09:00 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Dec 01 09:09:00 localhost kernel: Running certificate verification RSA selftest
Dec 01 09:09:00 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Dec 01 09:09:00 localhost kernel: Running certificate verification ECDSA selftest
Dec 01 09:09:00 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Dec 01 09:09:00 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Dec 01 09:09:00 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Dec 01 09:09:00 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Dec 01 09:09:00 localhost kernel: usb 1-1: Manufacturer: QEMU
Dec 01 09:09:00 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Dec 01 09:09:00 localhost kernel: clk: Disabling unused clocks
Dec 01 09:09:00 localhost kernel: Freeing unused decrypted memory: 2028K
Dec 01 09:09:00 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Dec 01 09:09:00 localhost kernel: Freeing unused kernel image (initmem) memory: 4192K
Dec 01 09:09:00 localhost kernel: Write protecting the kernel read-only data: 30720k
Dec 01 09:09:00 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Dec 01 09:09:00 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 436K
Dec 01 09:09:00 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Dec 01 09:09:00 localhost kernel: Run /init as init process
Dec 01 09:09:00 localhost kernel:   with arguments:
Dec 01 09:09:00 localhost kernel:     /init
Dec 01 09:09:00 localhost kernel:   with environment:
Dec 01 09:09:00 localhost kernel:     HOME=/
Dec 01 09:09:00 localhost kernel:     TERM=linux
Dec 01 09:09:00 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64
Dec 01 09:09:00 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 01 09:09:00 localhost systemd[1]: Detected virtualization kvm.
Dec 01 09:09:00 localhost systemd[1]: Detected architecture x86-64.
Dec 01 09:09:00 localhost systemd[1]: Running in initrd.
Dec 01 09:09:00 localhost systemd[1]: No hostname configured, using default hostname.
Dec 01 09:09:00 localhost systemd[1]: Hostname set to <localhost>.
Dec 01 09:09:00 localhost systemd[1]: Initializing machine ID from VM UUID.
Dec 01 09:09:00 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Dec 01 09:09:00 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 01 09:09:00 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 01 09:09:00 localhost systemd[1]: Reached target Initrd /usr File System.
Dec 01 09:09:00 localhost systemd[1]: Reached target Local File Systems.
Dec 01 09:09:00 localhost systemd[1]: Reached target Path Units.
Dec 01 09:09:00 localhost systemd[1]: Reached target Slice Units.
Dec 01 09:09:00 localhost systemd[1]: Reached target Swaps.
Dec 01 09:09:00 localhost systemd[1]: Reached target Timer Units.
Dec 01 09:09:00 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 01 09:09:00 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Dec 01 09:09:00 localhost systemd[1]: Listening on Journal Socket.
Dec 01 09:09:00 localhost systemd[1]: Listening on udev Control Socket.
Dec 01 09:09:00 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 01 09:09:00 localhost systemd[1]: Reached target Socket Units.
Dec 01 09:09:00 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 01 09:09:00 localhost systemd[1]: Starting Journal Service...
Dec 01 09:09:00 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec 01 09:09:00 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 01 09:09:00 localhost systemd[1]: Starting Create System Users...
Dec 01 09:09:00 localhost systemd[1]: Starting Setup Virtual Console...
Dec 01 09:09:00 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 01 09:09:00 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 01 09:09:00 localhost systemd[1]: Finished Create System Users.
Dec 01 09:09:00 localhost systemd-journald[308]: Journal started
Dec 01 09:09:00 localhost systemd-journald[308]: Runtime Journal (/run/log/journal/c016036bc2024470908b16395dc3b958) is 8.0M, max 153.6M, 145.6M free.
Dec 01 09:09:00 localhost systemd-sysusers[313]: Creating group 'users' with GID 100.
Dec 01 09:09:00 localhost systemd-sysusers[313]: Creating group 'dbus' with GID 81.
Dec 01 09:09:00 localhost systemd-sysusers[313]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Dec 01 09:09:00 localhost systemd[1]: Started Journal Service.
Dec 01 09:09:00 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 01 09:09:00 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 01 09:09:00 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 01 09:09:00 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 01 09:09:00 localhost systemd[1]: Finished Setup Virtual Console.
Dec 01 09:09:00 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Dec 01 09:09:00 localhost systemd[1]: Starting dracut cmdline hook...
Dec 01 09:09:00 localhost dracut-cmdline[328]: dracut-9 dracut-057-102.git20250818.el9
Dec 01 09:09:00 localhost dracut-cmdline[328]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 01 09:09:00 localhost systemd[1]: Finished dracut cmdline hook.
Dec 01 09:09:00 localhost systemd[1]: Starting dracut pre-udev hook...
Dec 01 09:09:00 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec 01 09:09:00 localhost kernel: device-mapper: uevent: version 1.0.3
Dec 01 09:09:00 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Dec 01 09:09:00 localhost kernel: RPC: Registered named UNIX socket transport module.
Dec 01 09:09:00 localhost kernel: RPC: Registered udp transport module.
Dec 01 09:09:00 localhost kernel: RPC: Registered tcp transport module.
Dec 01 09:09:00 localhost kernel: RPC: Registered tcp-with-tls transport module.
Dec 01 09:09:00 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Dec 01 09:09:00 localhost rpc.statd[444]: Version 2.5.4 starting
Dec 01 09:09:01 localhost rpc.statd[444]: Initializing NSM state
Dec 01 09:09:01 localhost rpc.idmapd[449]: Setting log level to 0
Dec 01 09:09:01 localhost systemd[1]: Finished dracut pre-udev hook.
Dec 01 09:09:01 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 01 09:09:01 localhost systemd-udevd[462]: Using default interface naming scheme 'rhel-9.0'.
Dec 01 09:09:01 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 01 09:09:01 localhost systemd[1]: Starting dracut pre-trigger hook...
Dec 01 09:09:01 localhost systemd[1]: Finished dracut pre-trigger hook.
Dec 01 09:09:01 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 01 09:09:01 localhost systemd[1]: Created slice Slice /system/modprobe.
Dec 01 09:09:01 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 01 09:09:01 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 01 09:09:01 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 01 09:09:01 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 01 09:09:01 localhost systemd[1]: Mounting Kernel Configuration File System...
Dec 01 09:09:01 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 01 09:09:01 localhost systemd[1]: Reached target Network.
Dec 01 09:09:01 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 01 09:09:01 localhost systemd[1]: Starting dracut initqueue hook...
Dec 01 09:09:01 localhost systemd[1]: Mounted Kernel Configuration File System.
Dec 01 09:09:01 localhost systemd[1]: Reached target System Initialization.
Dec 01 09:09:01 localhost systemd[1]: Reached target Basic System.
Dec 01 09:09:01 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Dec 01 09:09:01 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Dec 01 09:09:01 localhost kernel:  vda: vda1
Dec 01 09:09:01 localhost kernel: libata version 3.00 loaded.
Dec 01 09:09:01 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Dec 01 09:09:01 localhost kernel: scsi host0: ata_piix
Dec 01 09:09:01 localhost kernel: scsi host1: ata_piix
Dec 01 09:09:01 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Dec 01 09:09:01 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Dec 01 09:09:01 localhost systemd[1]: Found device /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253.
Dec 01 09:09:01 localhost systemd[1]: Reached target Initrd Root Device.
Dec 01 09:09:01 localhost kernel: ata1: found unknown device (class 0)
Dec 01 09:09:01 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Dec 01 09:09:01 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Dec 01 09:09:01 localhost systemd-udevd[475]: Network interface NamePolicy= disabled on kernel command line.
Dec 01 09:09:01 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Dec 01 09:09:01 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Dec 01 09:09:01 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Dec 01 09:09:01 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Dec 01 09:09:01 localhost systemd[1]: Finished dracut initqueue hook.
Dec 01 09:09:01 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Dec 01 09:09:01 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Dec 01 09:09:01 localhost systemd[1]: Reached target Remote File Systems.
Dec 01 09:09:01 localhost systemd[1]: Starting dracut pre-mount hook...
Dec 01 09:09:01 localhost systemd[1]: Finished dracut pre-mount hook.
Dec 01 09:09:01 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253...
Dec 01 09:09:01 localhost systemd-fsck[556]: /usr/sbin/fsck.xfs: XFS file system.
Dec 01 09:09:01 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253.
Dec 01 09:09:01 localhost systemd[1]: Mounting /sysroot...
Dec 01 09:09:02 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Dec 01 09:09:02 localhost kernel: XFS (vda1): Mounting V5 Filesystem b277050f-8ace-464d-abb6-4c46d4c45253
Dec 01 09:09:03 localhost kernel: XFS (vda1): Ending clean mount
Dec 01 09:09:03 localhost systemd[1]: Mounted /sysroot.
Dec 01 09:09:03 localhost systemd[1]: Reached target Initrd Root File System.
Dec 01 09:09:03 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Dec 01 09:09:03 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec 01 09:09:03 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Dec 01 09:09:03 localhost systemd[1]: Reached target Initrd File Systems.
Dec 01 09:09:03 localhost systemd[1]: Reached target Initrd Default Target.
Dec 01 09:09:03 localhost systemd[1]: Starting dracut mount hook...
Dec 01 09:09:03 localhost systemd[1]: Finished dracut mount hook.
Dec 01 09:09:03 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Dec 01 09:09:03 localhost rpc.idmapd[449]: exiting on signal 15
Dec 01 09:09:03 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Dec 01 09:09:03 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Dec 01 09:09:03 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Dec 01 09:09:03 localhost systemd[1]: Stopped target Network.
Dec 01 09:09:03 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Dec 01 09:09:03 localhost systemd[1]: Stopped target Timer Units.
Dec 01 09:09:03 localhost systemd[1]: dbus.socket: Deactivated successfully.
Dec 01 09:09:03 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Dec 01 09:09:03 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec 01 09:09:03 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Dec 01 09:09:03 localhost systemd[1]: Stopped target Initrd Default Target.
Dec 01 09:09:03 localhost systemd[1]: Stopped target Basic System.
Dec 01 09:09:03 localhost systemd[1]: Stopped target Initrd Root Device.
Dec 01 09:09:03 localhost systemd[1]: Stopped target Initrd /usr File System.
Dec 01 09:09:03 localhost systemd[1]: Stopped target Path Units.
Dec 01 09:09:03 localhost systemd[1]: Stopped target Remote File Systems.
Dec 01 09:09:03 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Dec 01 09:09:03 localhost systemd[1]: Stopped target Slice Units.
Dec 01 09:09:03 localhost systemd[1]: Stopped target Socket Units.
Dec 01 09:09:03 localhost systemd[1]: Stopped target System Initialization.
Dec 01 09:09:03 localhost systemd[1]: Stopped target Local File Systems.
Dec 01 09:09:03 localhost systemd[1]: Stopped target Swaps.
Dec 01 09:09:03 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Dec 01 09:09:03 localhost systemd[1]: Stopped dracut mount hook.
Dec 01 09:09:03 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec 01 09:09:03 localhost systemd[1]: Stopped dracut pre-mount hook.
Dec 01 09:09:03 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Dec 01 09:09:03 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec 01 09:09:03 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Dec 01 09:09:03 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec 01 09:09:03 localhost systemd[1]: Stopped dracut initqueue hook.
Dec 01 09:09:03 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 01 09:09:03 localhost systemd[1]: Stopped Apply Kernel Variables.
Dec 01 09:09:03 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec 01 09:09:03 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Dec 01 09:09:03 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec 01 09:09:03 localhost systemd[1]: Stopped Coldplug All udev Devices.
Dec 01 09:09:03 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec 01 09:09:03 localhost systemd[1]: Stopped dracut pre-trigger hook.
Dec 01 09:09:03 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Dec 01 09:09:03 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 01 09:09:03 localhost systemd[1]: Stopped Setup Virtual Console.
Dec 01 09:09:03 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Dec 01 09:09:03 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 01 09:09:03 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 01 09:09:03 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Dec 01 09:09:03 localhost systemd[1]: systemd-udevd.service: Consumed 1.032s CPU time.
Dec 01 09:09:03 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec 01 09:09:03 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Dec 01 09:09:03 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec 01 09:09:03 localhost systemd[1]: Closed udev Control Socket.
Dec 01 09:09:03 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec 01 09:09:03 localhost systemd[1]: Closed udev Kernel Socket.
Dec 01 09:09:03 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec 01 09:09:03 localhost systemd[1]: Stopped dracut pre-udev hook.
Dec 01 09:09:03 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec 01 09:09:03 localhost systemd[1]: Stopped dracut cmdline hook.
Dec 01 09:09:03 localhost systemd[1]: Starting Cleanup udev Database...
Dec 01 09:09:03 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec 01 09:09:03 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Dec 01 09:09:03 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec 01 09:09:03 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Dec 01 09:09:03 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Dec 01 09:09:03 localhost systemd[1]: Stopped Create System Users.
Dec 01 09:09:03 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Dec 01 09:09:03 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Dec 01 09:09:03 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec 01 09:09:03 localhost systemd[1]: Finished Cleanup udev Database.
Dec 01 09:09:03 localhost systemd[1]: Reached target Switch Root.
Dec 01 09:09:03 localhost systemd[1]: Starting Switch Root...
Dec 01 09:09:03 localhost systemd[1]: Switching root.
Dec 01 09:09:03 localhost systemd-journald[308]: Journal stopped
Dec 01 09:09:04 localhost systemd-journald[308]: Received SIGTERM from PID 1 (systemd).
Dec 01 09:09:04 localhost kernel: audit: type=1404 audit(1764580143.558:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Dec 01 09:09:04 localhost kernel: SELinux:  policy capability network_peer_controls=1
Dec 01 09:09:04 localhost kernel: SELinux:  policy capability open_perms=1
Dec 01 09:09:04 localhost kernel: SELinux:  policy capability extended_socket_class=1
Dec 01 09:09:04 localhost kernel: SELinux:  policy capability always_check_network=0
Dec 01 09:09:04 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 01 09:09:04 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 01 09:09:04 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 01 09:09:04 localhost kernel: audit: type=1403 audit(1764580143.674:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Dec 01 09:09:04 localhost systemd[1]: Successfully loaded SELinux policy in 118.096ms.
Dec 01 09:09:04 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 26.311ms.
Dec 01 09:09:04 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 01 09:09:04 localhost systemd[1]: Detected virtualization kvm.
Dec 01 09:09:04 localhost systemd[1]: Detected architecture x86-64.
Dec 01 09:09:04 localhost systemd-rc-local-generator[638]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:09:04 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Dec 01 09:09:04 localhost systemd[1]: Stopped Switch Root.
Dec 01 09:09:04 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec 01 09:09:04 localhost systemd[1]: Created slice Slice /system/getty.
Dec 01 09:09:04 localhost systemd[1]: Created slice Slice /system/serial-getty.
Dec 01 09:09:04 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Dec 01 09:09:04 localhost systemd[1]: Created slice User and Session Slice.
Dec 01 09:09:04 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 01 09:09:04 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Dec 01 09:09:04 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Dec 01 09:09:04 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 01 09:09:04 localhost systemd[1]: Stopped target Switch Root.
Dec 01 09:09:04 localhost systemd[1]: Stopped target Initrd File Systems.
Dec 01 09:09:04 localhost systemd[1]: Stopped target Initrd Root File System.
Dec 01 09:09:04 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Dec 01 09:09:04 localhost systemd[1]: Reached target Path Units.
Dec 01 09:09:04 localhost systemd[1]: Reached target rpc_pipefs.target.
Dec 01 09:09:04 localhost systemd[1]: Reached target Slice Units.
Dec 01 09:09:04 localhost systemd[1]: Reached target Swaps.
Dec 01 09:09:04 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Dec 01 09:09:04 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Dec 01 09:09:04 localhost systemd[1]: Reached target RPC Port Mapper.
Dec 01 09:09:04 localhost systemd[1]: Listening on Process Core Dump Socket.
Dec 01 09:09:04 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Dec 01 09:09:04 localhost systemd[1]: Listening on udev Control Socket.
Dec 01 09:09:04 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 01 09:09:04 localhost systemd[1]: Mounting Huge Pages File System...
Dec 01 09:09:04 localhost systemd[1]: Mounting POSIX Message Queue File System...
Dec 01 09:09:04 localhost systemd[1]: Mounting Kernel Debug File System...
Dec 01 09:09:04 localhost systemd[1]: Mounting Kernel Trace File System...
Dec 01 09:09:04 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 01 09:09:04 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 01 09:09:04 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 01 09:09:04 localhost systemd[1]: Starting Load Kernel Module drm...
Dec 01 09:09:04 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Dec 01 09:09:04 localhost systemd[1]: Starting Load Kernel Module fuse...
Dec 01 09:09:04 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Dec 01 09:09:04 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Dec 01 09:09:04 localhost systemd[1]: Stopped File System Check on Root Device.
Dec 01 09:09:04 localhost systemd[1]: Stopped Journal Service.
Dec 01 09:09:04 localhost systemd[1]: Starting Journal Service...
Dec 01 09:09:04 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec 01 09:09:04 localhost systemd[1]: Starting Generate network units from Kernel command line...
Dec 01 09:09:04 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 01 09:09:04 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Dec 01 09:09:04 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 01 09:09:04 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 01 09:09:04 localhost kernel: fuse: init (API version 7.37)
Dec 01 09:09:04 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 01 09:09:04 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Dec 01 09:09:04 localhost systemd[1]: Mounted Huge Pages File System.
Dec 01 09:09:04 localhost systemd[1]: Mounted POSIX Message Queue File System.
Dec 01 09:09:04 localhost systemd-journald[678]: Journal started
Dec 01 09:09:04 localhost systemd-journald[678]: Runtime Journal (/run/log/journal/1f988c78c563e12389ab342aced42dbb) is 8.0M, max 153.6M, 145.6M free.
Dec 01 09:09:03 localhost systemd[1]: Queued start job for default target Multi-User System.
Dec 01 09:09:03 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 01 09:09:04 localhost systemd[1]: Started Journal Service.
Dec 01 09:09:04 localhost systemd[1]: Mounted Kernel Debug File System.
Dec 01 09:09:04 localhost systemd[1]: Mounted Kernel Trace File System.
Dec 01 09:09:04 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 01 09:09:04 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 01 09:09:04 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 01 09:09:04 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 01 09:09:04 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Dec 01 09:09:04 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec 01 09:09:04 localhost systemd[1]: Finished Load Kernel Module fuse.
Dec 01 09:09:04 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Dec 01 09:09:04 localhost systemd[1]: Finished Generate network units from Kernel command line.
Dec 01 09:09:04 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Dec 01 09:09:04 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 01 09:09:04 localhost kernel: ACPI: bus type drm_connector registered
Dec 01 09:09:04 localhost systemd[1]: Mounting FUSE Control File System...
Dec 01 09:09:04 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 01 09:09:04 localhost systemd[1]: Starting Rebuild Hardware Database...
Dec 01 09:09:04 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Dec 01 09:09:04 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 01 09:09:04 localhost systemd[1]: Starting Load/Save OS Random Seed...
Dec 01 09:09:04 localhost systemd[1]: Starting Create System Users...
Dec 01 09:09:04 localhost systemd-journald[678]: Runtime Journal (/run/log/journal/1f988c78c563e12389ab342aced42dbb) is 8.0M, max 153.6M, 145.6M free.
Dec 01 09:09:04 localhost systemd-journald[678]: Received client request to flush runtime journal.
Dec 01 09:09:04 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 01 09:09:04 localhost systemd[1]: Finished Load Kernel Module drm.
Dec 01 09:09:04 localhost systemd[1]: Mounted FUSE Control File System.
Dec 01 09:09:04 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Dec 01 09:09:04 localhost systemd[1]: Finished Load/Save OS Random Seed.
Dec 01 09:09:04 localhost systemd[1]: Finished Create System Users.
Dec 01 09:09:04 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 01 09:09:04 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 01 09:09:04 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 01 09:09:04 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 01 09:09:04 localhost systemd[1]: Reached target Preparation for Local File Systems.
Dec 01 09:09:04 localhost systemd[1]: Reached target Local File Systems.
Dec 01 09:09:04 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Dec 01 09:09:04 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Dec 01 09:09:04 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 01 09:09:04 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Dec 01 09:09:04 localhost systemd[1]: Starting Automatic Boot Loader Update...
Dec 01 09:09:04 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Dec 01 09:09:04 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 01 09:09:04 localhost bootctl[698]: Couldn't find EFI system partition, skipping.
Dec 01 09:09:04 localhost systemd[1]: Finished Automatic Boot Loader Update.
Dec 01 09:09:04 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 01 09:09:04 localhost systemd[1]: Starting Security Auditing Service...
Dec 01 09:09:04 localhost auditd[703]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Dec 01 09:09:04 localhost auditd[703]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Dec 01 09:09:04 localhost systemd[1]: Starting RPC Bind...
Dec 01 09:09:04 localhost systemd[1]: Starting Rebuild Journal Catalog...
Dec 01 09:09:04 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Dec 01 09:09:04 localhost systemd[1]: Started RPC Bind.
Dec 01 09:09:04 localhost augenrules[710]: /sbin/augenrules: No change
Dec 01 09:09:04 localhost systemd[1]: Finished Rebuild Journal Catalog.
Dec 01 09:09:04 localhost augenrules[725]: No rules
Dec 01 09:09:04 localhost augenrules[725]: enabled 1
Dec 01 09:09:04 localhost augenrules[725]: failure 1
Dec 01 09:09:04 localhost augenrules[725]: pid 703
Dec 01 09:09:04 localhost augenrules[725]: rate_limit 0
Dec 01 09:09:04 localhost augenrules[725]: backlog_limit 8192
Dec 01 09:09:04 localhost augenrules[725]: lost 0
Dec 01 09:09:04 localhost augenrules[725]: backlog 3
Dec 01 09:09:04 localhost augenrules[725]: backlog_wait_time 60000
Dec 01 09:09:04 localhost augenrules[725]: backlog_wait_time_actual 0
Dec 01 09:09:04 localhost augenrules[725]: enabled 1
Dec 01 09:09:04 localhost augenrules[725]: failure 1
Dec 01 09:09:04 localhost augenrules[725]: pid 703
Dec 01 09:09:04 localhost augenrules[725]: rate_limit 0
Dec 01 09:09:04 localhost augenrules[725]: backlog_limit 8192
Dec 01 09:09:04 localhost augenrules[725]: lost 0
Dec 01 09:09:04 localhost augenrules[725]: backlog 3
Dec 01 09:09:04 localhost augenrules[725]: backlog_wait_time 60000
Dec 01 09:09:04 localhost augenrules[725]: backlog_wait_time_actual 0
Dec 01 09:09:04 localhost augenrules[725]: enabled 1
Dec 01 09:09:04 localhost augenrules[725]: failure 1
Dec 01 09:09:04 localhost augenrules[725]: pid 703
Dec 01 09:09:04 localhost augenrules[725]: rate_limit 0
Dec 01 09:09:04 localhost augenrules[725]: backlog_limit 8192
Dec 01 09:09:04 localhost augenrules[725]: lost 0
Dec 01 09:09:04 localhost augenrules[725]: backlog 0
Dec 01 09:09:04 localhost augenrules[725]: backlog_wait_time 60000
Dec 01 09:09:04 localhost augenrules[725]: backlog_wait_time_actual 0
Dec 01 09:09:04 localhost systemd[1]: Started Security Auditing Service.
Dec 01 09:09:04 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Dec 01 09:09:04 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Dec 01 09:09:05 localhost systemd[1]: Finished Rebuild Hardware Database.
Dec 01 09:09:05 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 01 09:09:05 localhost systemd[1]: Starting Update is Completed...
Dec 01 09:09:05 localhost systemd[1]: Finished Update is Completed.
Dec 01 09:09:05 localhost systemd-udevd[734]: Using default interface naming scheme 'rhel-9.0'.
Dec 01 09:09:05 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 01 09:09:05 localhost systemd[1]: Reached target System Initialization.
Dec 01 09:09:05 localhost systemd[1]: Started dnf makecache --timer.
Dec 01 09:09:05 localhost systemd[1]: Started Daily rotation of log files.
Dec 01 09:09:05 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Dec 01 09:09:05 localhost systemd[1]: Reached target Timer Units.
Dec 01 09:09:05 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 01 09:09:05 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Dec 01 09:09:05 localhost systemd[1]: Reached target Socket Units.
Dec 01 09:09:05 localhost systemd[1]: Starting D-Bus System Message Bus...
Dec 01 09:09:05 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 01 09:09:05 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Dec 01 09:09:05 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 01 09:09:05 localhost systemd-udevd[745]: Network interface NamePolicy= disabled on kernel command line.
Dec 01 09:09:05 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 01 09:09:05 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 01 09:09:05 localhost systemd[1]: Started D-Bus System Message Bus.
Dec 01 09:09:05 localhost systemd[1]: Reached target Basic System.
Dec 01 09:09:05 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Dec 01 09:09:05 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Dec 01 09:09:05 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Dec 01 09:09:05 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Dec 01 09:09:05 localhost dbus-broker-lau[765]: Ready
Dec 01 09:09:05 localhost systemd[1]: Starting NTP client/server...
Dec 01 09:09:05 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Dec 01 09:09:05 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Dec 01 09:09:05 localhost systemd[1]: Starting IPv4 firewall with iptables...
Dec 01 09:09:05 localhost systemd[1]: Started irqbalance daemon.
Dec 01 09:09:05 localhost chronyd[788]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec 01 09:09:05 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Dec 01 09:09:05 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 01 09:09:05 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 01 09:09:05 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 01 09:09:05 localhost systemd[1]: Reached target sshd-keygen.target.
Dec 01 09:09:05 localhost chronyd[788]: Loaded 0 symmetric keys
Dec 01 09:09:05 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Dec 01 09:09:05 localhost systemd[1]: Reached target User and Group Name Lookups.
Dec 01 09:09:05 localhost chronyd[788]: Using right/UTC timezone to obtain leap second data
Dec 01 09:09:05 localhost chronyd[788]: Loaded seccomp filter (level 2)
Dec 01 09:09:05 localhost systemd[1]: Starting User Login Management...
Dec 01 09:09:05 localhost systemd[1]: Started NTP client/server.
Dec 01 09:09:05 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Dec 01 09:09:05 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Dec 01 09:09:05 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Dec 01 09:09:05 localhost systemd-logind[795]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 01 09:09:05 localhost systemd-logind[795]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 01 09:09:05 localhost systemd-logind[795]: New seat seat0.
Dec 01 09:09:05 localhost systemd[1]: Started User Login Management.
Dec 01 09:09:05 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Dec 01 09:09:05 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Dec 01 09:09:05 localhost kernel: Console: switching to colour dummy device 80x25
Dec 01 09:09:05 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec 01 09:09:05 localhost kernel: [drm] features: -context_init
Dec 01 09:09:05 localhost kernel: [drm] number of scanouts: 1
Dec 01 09:09:05 localhost kernel: [drm] number of cap sets: 0
Dec 01 09:09:05 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Dec 01 09:09:05 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Dec 01 09:09:05 localhost kernel: Console: switching to colour frame buffer device 128x48
Dec 01 09:09:05 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec 01 09:09:05 localhost kernel: kvm_amd: TSC scaling supported
Dec 01 09:09:05 localhost kernel: kvm_amd: Nested Virtualization enabled
Dec 01 09:09:05 localhost kernel: kvm_amd: Nested Paging enabled
Dec 01 09:09:05 localhost kernel: kvm_amd: LBR virtualization supported
Dec 01 09:09:05 localhost iptables.init[783]: iptables: Applying firewall rules: [  OK  ]
Dec 01 09:09:05 localhost systemd[1]: Finished IPv4 firewall with iptables.
Dec 01 09:09:06 localhost cloud-init[842]: Cloud-init v. 24.4-7.el9 running 'init-local' at Mon, 01 Dec 2025 09:09:06 +0000. Up 8.05 seconds.
Dec 01 09:09:06 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Dec 01 09:09:06 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Dec 01 09:09:06 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpq_31wj8l.mount: Deactivated successfully.
Dec 01 09:09:06 localhost systemd[1]: Starting Hostname Service...
Dec 01 09:09:06 localhost systemd[1]: Started Hostname Service.
Dec 01 09:09:06 np0005540827.novalocal systemd-hostnamed[856]: Hostname set to <np0005540827.novalocal> (static)
Dec 01 09:09:06 np0005540827.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Dec 01 09:09:06 np0005540827.novalocal systemd[1]: Reached target Preparation for Network.
Dec 01 09:09:06 np0005540827.novalocal systemd[1]: Starting Network Manager...
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7036] NetworkManager (version 1.54.1-1.el9) is starting... (boot:b3cb21dd-233c-423c-aa19-329645e7ae96)
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7041] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7103] manager[0x559d82d93080]: monitoring kernel firmware directory '/lib/firmware'.
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7135] hostname: hostname: using hostnamed
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7135] hostname: static hostname changed from (none) to "np0005540827.novalocal"
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7138] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7280] manager[0x559d82d93080]: rfkill: Wi-Fi hardware radio set enabled
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7280] manager[0x559d82d93080]: rfkill: WWAN hardware radio set enabled
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7312] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7313] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7314] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7315] manager: Networking is enabled by state file
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7317] settings: Loaded settings plugin: keyfile (internal)
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7331] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7346] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7359] dhcp: init: Using DHCP client 'internal'
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7361] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7371] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 01 09:09:06 np0005540827.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7378] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7386] device (lo): Activation: starting connection 'lo' (85edc75a-527c-4c5c-9e5c-ea0fbf93ba32)
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7395] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7398] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7423] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7427] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7431] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7433] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7435] device (eth0): carrier: link connected
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7439] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7444] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7450] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7453] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7455] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7457] manager: NetworkManager state is now CONNECTING
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7459] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7463] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7467] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7508] dhcp4 (eth0): state changed new lease, address=38.102.83.236
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7514] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7530] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 01 09:09:06 np0005540827.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 01 09:09:06 np0005540827.novalocal systemd[1]: Started Network Manager.
Dec 01 09:09:06 np0005540827.novalocal systemd[1]: Reached target Network.
Dec 01 09:09:06 np0005540827.novalocal systemd[1]: Starting Network Manager Wait Online...
Dec 01 09:09:06 np0005540827.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Dec 01 09:09:06 np0005540827.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 01 09:09:06 np0005540827.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Dec 01 09:09:06 np0005540827.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 01 09:09:06 np0005540827.novalocal systemd[1]: Reached target NFS client services.
Dec 01 09:09:06 np0005540827.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Dec 01 09:09:06 np0005540827.novalocal systemd[1]: Reached target Remote File Systems.
Dec 01 09:09:06 np0005540827.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7946] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7951] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7961] device (lo): Activation: successful, device activated.
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7971] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7975] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7981] manager: NetworkManager state is now CONNECTED_SITE
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7986] device (eth0): Activation: successful, device activated.
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7994] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 01 09:09:06 np0005540827.novalocal NetworkManager[860]: <info>  [1764580146.7999] manager: startup complete
Dec 01 09:09:06 np0005540827.novalocal systemd[1]: Finished Network Manager Wait Online.
Dec 01 09:09:06 np0005540827.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Dec 01 09:09:07 np0005540827.novalocal cloud-init[924]: Cloud-init v. 24.4-7.el9 running 'init' at Mon, 01 Dec 2025 09:09:07 +0000. Up 8.98 seconds.
Dec 01 09:09:07 np0005540827.novalocal cloud-init[924]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Dec 01 09:09:07 np0005540827.novalocal cloud-init[924]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 01 09:09:07 np0005540827.novalocal cloud-init[924]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Dec 01 09:09:07 np0005540827.novalocal cloud-init[924]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 01 09:09:07 np0005540827.novalocal cloud-init[924]: ci-info: |  eth0  | True |        38.102.83.236         | 255.255.255.0 | global | fa:16:3e:93:dd:4f |
Dec 01 09:09:07 np0005540827.novalocal cloud-init[924]: ci-info: |  eth0  | True | fe80::f816:3eff:fe93:dd4f/64 |       .       |  link  | fa:16:3e:93:dd:4f |
Dec 01 09:09:07 np0005540827.novalocal cloud-init[924]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Dec 01 09:09:07 np0005540827.novalocal cloud-init[924]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Dec 01 09:09:07 np0005540827.novalocal cloud-init[924]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 01 09:09:07 np0005540827.novalocal cloud-init[924]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Dec 01 09:09:07 np0005540827.novalocal cloud-init[924]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 01 09:09:07 np0005540827.novalocal cloud-init[924]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Dec 01 09:09:07 np0005540827.novalocal cloud-init[924]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 01 09:09:07 np0005540827.novalocal cloud-init[924]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Dec 01 09:09:07 np0005540827.novalocal cloud-init[924]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Dec 01 09:09:07 np0005540827.novalocal cloud-init[924]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Dec 01 09:09:07 np0005540827.novalocal cloud-init[924]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 01 09:09:07 np0005540827.novalocal cloud-init[924]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Dec 01 09:09:07 np0005540827.novalocal cloud-init[924]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 01 09:09:07 np0005540827.novalocal cloud-init[924]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Dec 01 09:09:07 np0005540827.novalocal cloud-init[924]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 01 09:09:07 np0005540827.novalocal cloud-init[924]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Dec 01 09:09:07 np0005540827.novalocal cloud-init[924]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Dec 01 09:09:07 np0005540827.novalocal cloud-init[924]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 01 09:09:08 np0005540827.novalocal useradd[990]: new group: name=cloud-user, GID=1001
Dec 01 09:09:08 np0005540827.novalocal useradd[990]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Dec 01 09:09:08 np0005540827.novalocal useradd[990]: add 'cloud-user' to group 'adm'
Dec 01 09:09:08 np0005540827.novalocal useradd[990]: add 'cloud-user' to group 'systemd-journal'
Dec 01 09:09:08 np0005540827.novalocal useradd[990]: add 'cloud-user' to shadow group 'adm'
Dec 01 09:09:08 np0005540827.novalocal useradd[990]: add 'cloud-user' to shadow group 'systemd-journal'
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: Generating public/private rsa key pair.
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: The key fingerprint is:
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: SHA256:Hf2/bMGIOmm19hmCBAbx/KsTHat5LiaJdZFLW9Z+yD0 root@np0005540827.novalocal
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: The key's randomart image is:
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: +---[RSA 3072]----+
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: |     o.          |
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: |      +    .     |
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: |       =. o .    |
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: |      .+o= o .   |
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: |      . So* + +  |
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: |     . =.oo* E + |
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: |    o o +o+.o.. o|
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: |   . o *.* o. +..|
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: |      o.*.o .o.o |
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: +----[SHA256]-----+
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: Generating public/private ecdsa key pair.
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: The key fingerprint is:
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: SHA256:pnvoy1Xm64y1phbDmDAD0Zm7N2uom4AXpknHBHYAjtI root@np0005540827.novalocal
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: The key's randomart image is:
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: +---[ECDSA 256]---+
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: |o+o+ o           |
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: |+.+ +            |
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: |o.Eo .           |
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: |. o =            |
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: | .oo = +S o      |
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: |o+... =o++       |
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: |= .  o.+.oo      |
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: | o ...+oo+.o     |
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: |  +o o=+o+=      |
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: +----[SHA256]-----+
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: Generating public/private ed25519 key pair.
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: The key fingerprint is:
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: SHA256:+q9QobtCfQSjfEGg5REHwbJ23SGI1kdheTnmvSW+g2I root@np0005540827.novalocal
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: The key's randomart image is:
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: +--[ED25519 256]--+
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: |   +BO*o .       |
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: |  ++++B *        |
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: | ..+.+ X.+       |
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: |  o + o.+.o .    |
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: | . . o..S. +     |
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: |    . .+. o      |
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: |   .  +. . .     |
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: |    . E+. o      |
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: |     o..oo..     |
Dec 01 09:09:08 np0005540827.novalocal cloud-init[924]: +----[SHA256]-----+
Dec 01 09:09:08 np0005540827.novalocal sm-notify[1006]: Version 2.5.4 starting
Dec 01 09:09:08 np0005540827.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Dec 01 09:09:08 np0005540827.novalocal systemd[1]: Reached target Cloud-config availability.
Dec 01 09:09:08 np0005540827.novalocal systemd[1]: Reached target Network is Online.
Dec 01 09:09:08 np0005540827.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Dec 01 09:09:08 np0005540827.novalocal systemd[1]: Starting Crash recovery kernel arming...
Dec 01 09:09:08 np0005540827.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Dec 01 09:09:08 np0005540827.novalocal systemd[1]: Starting System Logging Service...
Dec 01 09:09:08 np0005540827.novalocal systemd[1]: Starting OpenSSH server daemon...
Dec 01 09:09:08 np0005540827.novalocal systemd[1]: Starting Permit User Sessions...
Dec 01 09:09:08 np0005540827.novalocal systemd[1]: Started Notify NFS peers of a restart.
Dec 01 09:09:08 np0005540827.novalocal sshd[1008]: Server listening on 0.0.0.0 port 22.
Dec 01 09:09:08 np0005540827.novalocal sshd[1008]: Server listening on :: port 22.
Dec 01 09:09:08 np0005540827.novalocal systemd[1]: Started OpenSSH server daemon.
Dec 01 09:09:08 np0005540827.novalocal systemd[1]: Finished Permit User Sessions.
Dec 01 09:09:08 np0005540827.novalocal systemd[1]: Started Command Scheduler.
Dec 01 09:09:08 np0005540827.novalocal systemd[1]: Started Getty on tty1.
Dec 01 09:09:08 np0005540827.novalocal crond[1011]: (CRON) STARTUP (1.5.7)
Dec 01 09:09:08 np0005540827.novalocal crond[1011]: (CRON) INFO (Syslog will be used instead of sendmail.)
Dec 01 09:09:08 np0005540827.novalocal crond[1011]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 47% if used.)
Dec 01 09:09:08 np0005540827.novalocal crond[1011]: (CRON) INFO (running with inotify support)
Dec 01 09:09:08 np0005540827.novalocal systemd[1]: Started Serial Getty on ttyS0.
Dec 01 09:09:08 np0005540827.novalocal systemd[1]: Reached target Login Prompts.
Dec 01 09:09:08 np0005540827.novalocal rsyslogd[1007]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1007" x-info="https://www.rsyslog.com"] start
Dec 01 09:09:08 np0005540827.novalocal rsyslogd[1007]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Dec 01 09:09:08 np0005540827.novalocal systemd[1]: Started System Logging Service.
Dec 01 09:09:08 np0005540827.novalocal systemd[1]: Reached target Multi-User System.
Dec 01 09:09:08 np0005540827.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Dec 01 09:09:08 np0005540827.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Dec 01 09:09:08 np0005540827.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Dec 01 09:09:08 np0005540827.novalocal rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 01 09:09:09 np0005540827.novalocal kdumpctl[1016]: kdump: No kdump initial ramdisk found.
Dec 01 09:09:09 np0005540827.novalocal kdumpctl[1016]: kdump: Rebuilding /boot/initramfs-5.14.0-642.el9.x86_64kdump.img
Dec 01 09:09:09 np0005540827.novalocal cloud-init[1129]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Mon, 01 Dec 2025 09:09:09 +0000. Up 10.98 seconds.
Dec 01 09:09:09 np0005540827.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Dec 01 09:09:09 np0005540827.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Dec 01 09:09:09 np0005540827.novalocal dracut[1268]: dracut-057-102.git20250818.el9
Dec 01 09:09:09 np0005540827.novalocal cloud-init[1286]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Mon, 01 Dec 2025 09:09:09 +0000. Up 11.38 seconds.
Dec 01 09:09:09 np0005540827.novalocal cloud-init[1290]: #############################################################
Dec 01 09:09:09 np0005540827.novalocal cloud-init[1292]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Dec 01 09:09:09 np0005540827.novalocal cloud-init[1300]: 256 SHA256:pnvoy1Xm64y1phbDmDAD0Zm7N2uom4AXpknHBHYAjtI root@np0005540827.novalocal (ECDSA)
Dec 01 09:09:09 np0005540827.novalocal dracut[1270]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-642.el9.x86_64kdump.img 5.14.0-642.el9.x86_64
Dec 01 09:09:09 np0005540827.novalocal cloud-init[1306]: 256 SHA256:+q9QobtCfQSjfEGg5REHwbJ23SGI1kdheTnmvSW+g2I root@np0005540827.novalocal (ED25519)
Dec 01 09:09:09 np0005540827.novalocal cloud-init[1315]: 3072 SHA256:Hf2/bMGIOmm19hmCBAbx/KsTHat5LiaJdZFLW9Z+yD0 root@np0005540827.novalocal (RSA)
Dec 01 09:09:09 np0005540827.novalocal cloud-init[1317]: -----END SSH HOST KEY FINGERPRINTS-----
Dec 01 09:09:09 np0005540827.novalocal cloud-init[1319]: #############################################################
Dec 01 09:09:09 np0005540827.novalocal cloud-init[1286]: Cloud-init v. 24.4-7.el9 finished at Mon, 01 Dec 2025 09:09:09 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 11.59 seconds
Dec 01 09:09:09 np0005540827.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Dec 01 09:09:09 np0005540827.novalocal systemd[1]: Reached target Cloud-init target.
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: Module 'resume' will not be installed, because it's in the list to be omitted!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: memstrack is not available
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
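The module-scan lines above are informational: dracut probes for each module's required commands and skips the module when they are absent, which is expected on a minimal cloud image without LVM, iSCSI, etc. A small sketch for tallying these skips per module when triaging a journal dump (the regex is an assumption matched against the message format shown above):

```python
import re
from collections import Counter

# Matches dracut's "module will not be installed" message as logged above.
PAT = re.compile(r"dracut module '(\S+)' will not be installed, "
                 r"because command '([^']+)' could not be found")

def missing_module_commands(lines):
    """Tally, per dracut module, how many required commands were absent."""
    counts = Counter()
    for line in lines:
        m = PAT.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts

sample = [
    "dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!",
    "dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!",
    "dracut[1270]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!",
]
print(missing_module_commands(sample))
```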
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: memstrack is not available
Dec 01 09:09:10 np0005540827.novalocal sshd-session[1829]: Unable to negotiate with 38.102.83.114 port 50188: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Dec 01 09:09:10 np0005540827.novalocal dracut[1270]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 01 09:09:10 np0005540827.novalocal sshd-session[1838]: Connection closed by 38.102.83.114 port 50202 [preauth]
Dec 01 09:09:11 np0005540827.novalocal sshd-session[1853]: Unable to negotiate with 38.102.83.114 port 50228: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Dec 01 09:09:11 np0005540827.novalocal sshd-session[1845]: Unable to negotiate with 38.102.83.114 port 50218: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Dec 01 09:09:11 np0005540827.novalocal sshd-session[1818]: Connection closed by 38.102.83.114 port 50186 [preauth]
Dec 01 09:09:11 np0005540827.novalocal sshd-session[1861]: Connection reset by 38.102.83.114 port 50230 [preauth]
Dec 01 09:09:11 np0005540827.novalocal sshd-session[1884]: Unable to negotiate with 38.102.83.114 port 50254: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Dec 01 09:09:11 np0005540827.novalocal sshd-session[1888]: Unable to negotiate with 38.102.83.114 port 50256: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Dec 01 09:09:11 np0005540827.novalocal sshd-session[1871]: Connection closed by 38.102.83.114 port 50242 [preauth]
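The repeated "no matching host key type found" failures above are preauth probes from a single client offering algorithms this sshd does not serve (legacy `ssh-dss`/`ssh-rsa`, plus ECDSA/ED25519 variants negotiated per-connection). A sketch for extracting which algorithms each rejecting peer offered, using a regex modeled on the message format in these lines:

```python
import re

# Matches sshd's "no matching host key type" preauth failure, as logged above.
PAT = re.compile(
    r"Unable to negotiate with (?P<ip>\S+) port (?P<port>\d+): "
    r"no matching host key type found\. Their offer: (?P<offer>\S+)"
)

def offered_algorithms(lines):
    """Map each rejected peer IP to the set of host-key algorithms it offered."""
    offers = {}
    for line in lines:
        m = PAT.search(line)
        if m:
            offers.setdefault(m["ip"], set()).update(m["offer"].split(","))
    return offers

sample = [
    "sshd-session[1884]: Unable to negotiate with 38.102.83.114 port 50254: "
    "no matching host key type found. Their offer: "
    "ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]",
]
print(offered_algorithms(sample))
```

`ssh -Q HostKeyAlgorithms` on the server shows the algorithms it could accept, for comparison against the offers collected here.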
Dec 01 09:09:11 np0005540827.novalocal dracut[1270]: *** Including module: systemd ***
Dec 01 09:09:11 np0005540827.novalocal dracut[1270]: *** Including module: fips ***
Dec 01 09:09:11 np0005540827.novalocal chronyd[788]: Selected source 174.138.193.90 (2.centos.pool.ntp.org)
Dec 01 09:09:11 np0005540827.novalocal chronyd[788]: System clock TAI offset set to 37 seconds
Dec 01 09:09:11 np0005540827.novalocal dracut[1270]: *** Including module: systemd-initrd ***
Dec 01 09:09:11 np0005540827.novalocal dracut[1270]: *** Including module: i18n ***
Dec 01 09:09:12 np0005540827.novalocal dracut[1270]: *** Including module: drm ***
Dec 01 09:09:12 np0005540827.novalocal dracut[1270]: *** Including module: prefixdevname ***
Dec 01 09:09:12 np0005540827.novalocal dracut[1270]: *** Including module: kernel-modules ***
Dec 01 09:09:12 np0005540827.novalocal kernel: block vda: the capability attribute has been deprecated.
Dec 01 09:09:12 np0005540827.novalocal dracut[1270]: *** Including module: kernel-modules-extra ***
Dec 01 09:09:12 np0005540827.novalocal dracut[1270]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Dec 01 09:09:12 np0005540827.novalocal dracut[1270]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Dec 01 09:09:12 np0005540827.novalocal dracut[1270]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Dec 01 09:09:12 np0005540827.novalocal dracut[1270]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Dec 01 09:09:12 np0005540827.novalocal dracut[1270]: *** Including module: qemu ***
Dec 01 09:09:12 np0005540827.novalocal dracut[1270]: *** Including module: fstab-sys ***
Dec 01 09:09:12 np0005540827.novalocal dracut[1270]: *** Including module: rootfs-block ***
Dec 01 09:09:13 np0005540827.novalocal dracut[1270]: *** Including module: terminfo ***
Dec 01 09:09:13 np0005540827.novalocal dracut[1270]: *** Including module: udev-rules ***
Dec 01 09:09:13 np0005540827.novalocal dracut[1270]: Skipping udev rule: 91-permissions.rules
Dec 01 09:09:13 np0005540827.novalocal dracut[1270]: Skipping udev rule: 80-drivers-modprobe.rules
Dec 01 09:09:13 np0005540827.novalocal dracut[1270]: *** Including module: virtiofs ***
Dec 01 09:09:13 np0005540827.novalocal dracut[1270]: *** Including module: dracut-systemd ***
Dec 01 09:09:13 np0005540827.novalocal dracut[1270]: *** Including module: usrmount ***
Dec 01 09:09:13 np0005540827.novalocal dracut[1270]: *** Including module: base ***
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]: *** Including module: fs-lib ***
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]: *** Including module: kdumpbase ***
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]: *** Including module: microcode_ctl-fw_dir_override ***
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]:   microcode_ctl module: mangling fw_dir
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]:     microcode_ctl: configuration "intel" is ignored
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]: *** Including module: openssl ***
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]: *** Including module: shutdown ***
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]: *** Including module: squash ***
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]: *** Including modules done ***
Dec 01 09:09:14 np0005540827.novalocal dracut[1270]: *** Installing kernel module dependencies ***
Dec 01 09:09:15 np0005540827.novalocal dracut[1270]: *** Installing kernel module dependencies done ***
Dec 01 09:09:15 np0005540827.novalocal dracut[1270]: *** Resolving executable dependencies ***
Dec 01 09:09:16 np0005540827.novalocal irqbalance[789]: Cannot change IRQ 25 affinity: Operation not permitted
Dec 01 09:09:16 np0005540827.novalocal irqbalance[789]: IRQ 25 affinity is now unmanaged
Dec 01 09:09:16 np0005540827.novalocal irqbalance[789]: Cannot change IRQ 31 affinity: Operation not permitted
Dec 01 09:09:16 np0005540827.novalocal irqbalance[789]: IRQ 31 affinity is now unmanaged
Dec 01 09:09:16 np0005540827.novalocal irqbalance[789]: Cannot change IRQ 28 affinity: Operation not permitted
Dec 01 09:09:16 np0005540827.novalocal irqbalance[789]: IRQ 28 affinity is now unmanaged
Dec 01 09:09:16 np0005540827.novalocal irqbalance[789]: Cannot change IRQ 32 affinity: Operation not permitted
Dec 01 09:09:16 np0005540827.novalocal irqbalance[789]: IRQ 32 affinity is now unmanaged
Dec 01 09:09:16 np0005540827.novalocal irqbalance[789]: Cannot change IRQ 30 affinity: Operation not permitted
Dec 01 09:09:16 np0005540827.novalocal irqbalance[789]: IRQ 30 affinity is now unmanaged
Dec 01 09:09:16 np0005540827.novalocal irqbalance[789]: Cannot change IRQ 29 affinity: Operation not permitted
Dec 01 09:09:16 np0005540827.novalocal irqbalance[789]: IRQ 29 affinity is now unmanaged
Dec 01 09:09:16 np0005540827.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 01 09:09:17 np0005540827.novalocal dracut[1270]: *** Resolving executable dependencies done ***
Dec 01 09:09:17 np0005540827.novalocal dracut[1270]: *** Generating early-microcode cpio image ***
Dec 01 09:09:17 np0005540827.novalocal dracut[1270]: *** Store current command line parameters ***
Dec 01 09:09:17 np0005540827.novalocal dracut[1270]: Stored kernel commandline:
Dec 01 09:09:17 np0005540827.novalocal dracut[1270]: No dracut internal kernel commandline stored in the initramfs
Dec 01 09:09:17 np0005540827.novalocal dracut[1270]: *** Install squash loader ***
Dec 01 09:09:18 np0005540827.novalocal dracut[1270]: *** Squashing the files inside the initramfs ***
Dec 01 09:09:19 np0005540827.novalocal dracut[1270]: *** Squashing the files inside the initramfs done ***
Dec 01 09:09:19 np0005540827.novalocal dracut[1270]: *** Creating image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' ***
Dec 01 09:09:19 np0005540827.novalocal dracut[1270]: *** Hardlinking files ***
Dec 01 09:09:19 np0005540827.novalocal dracut[1270]: Mode:           real
Dec 01 09:09:19 np0005540827.novalocal dracut[1270]: Files:          50
Dec 01 09:09:19 np0005540827.novalocal dracut[1270]: Linked:         0 files
Dec 01 09:09:19 np0005540827.novalocal dracut[1270]: Compared:       0 xattrs
Dec 01 09:09:19 np0005540827.novalocal dracut[1270]: Compared:       0 files
Dec 01 09:09:19 np0005540827.novalocal dracut[1270]: Saved:          0 B
Dec 01 09:09:19 np0005540827.novalocal dracut[1270]: Duration:       0.001003 seconds
Dec 01 09:09:19 np0005540827.novalocal dracut[1270]: *** Hardlinking files done ***
Dec 01 09:09:19 np0005540827.novalocal dracut[1270]: *** Creating initramfs image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' done ***
Dec 01 09:09:20 np0005540827.novalocal kdumpctl[1016]: kdump: kexec: loaded kdump kernel
Dec 01 09:09:20 np0005540827.novalocal kdumpctl[1016]: kdump: Starting kdump: [OK]
Dec 01 09:09:20 np0005540827.novalocal systemd[1]: Finished Crash recovery kernel arming.
Dec 01 09:09:20 np0005540827.novalocal systemd[1]: Startup finished in 1.907s (kernel) + 3.535s (initrd) + 16.949s (userspace) = 22.392s.
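The "Startup finished" line above is systemd's boot-time summary; the phase durations should add up to the stated total (modulo rounding). A sketch that parses that specific message shape into per-phase seconds:

```python
import re

def parse_startup(line: str) -> dict:
    """Extract per-phase boot durations (in seconds) from systemd's
    'Startup finished' message, plus the stated total."""
    phases = {
        name: float(sec)
        for sec, name in re.findall(r"([\d.]+)s \((\w+)\)", line)
    }
    phases["total"] = float(re.search(r"= ([\d.]+)s", line).group(1))
    return phases

line = ("Startup finished in 1.907s (kernel) + 3.535s (initrd) "
        "+ 16.949s (userspace) = 22.392s.")
print(parse_startup(line))
```

On a live system, `systemd-analyze` and `systemd-analyze blame` give the same breakdown without log parsing.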
Dec 01 09:09:31 np0005540827.novalocal sshd-session[4298]: Accepted publickey for zuul from 38.102.83.114 port 50350 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Dec 01 09:09:31 np0005540827.novalocal systemd[1]: Created slice User Slice of UID 1000.
Dec 01 09:09:31 np0005540827.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Dec 01 09:09:31 np0005540827.novalocal systemd-logind[795]: New session 1 of user zuul.
Dec 01 09:09:31 np0005540827.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Dec 01 09:09:31 np0005540827.novalocal systemd[1]: Starting User Manager for UID 1000...
Dec 01 09:09:31 np0005540827.novalocal systemd[4302]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:09:31 np0005540827.novalocal systemd[4302]: Queued start job for default target Main User Target.
Dec 01 09:09:31 np0005540827.novalocal systemd[4302]: Created slice User Application Slice.
Dec 01 09:09:31 np0005540827.novalocal systemd[4302]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 01 09:09:31 np0005540827.novalocal systemd[4302]: Started Daily Cleanup of User's Temporary Directories.
Dec 01 09:09:31 np0005540827.novalocal systemd[4302]: Reached target Paths.
Dec 01 09:09:31 np0005540827.novalocal systemd[4302]: Reached target Timers.
Dec 01 09:09:31 np0005540827.novalocal systemd[4302]: Starting D-Bus User Message Bus Socket...
Dec 01 09:09:31 np0005540827.novalocal systemd[4302]: Starting Create User's Volatile Files and Directories...
Dec 01 09:09:31 np0005540827.novalocal systemd[4302]: Finished Create User's Volatile Files and Directories.
Dec 01 09:09:31 np0005540827.novalocal systemd[4302]: Listening on D-Bus User Message Bus Socket.
Dec 01 09:09:31 np0005540827.novalocal systemd[4302]: Reached target Sockets.
Dec 01 09:09:31 np0005540827.novalocal systemd[4302]: Reached target Basic System.
Dec 01 09:09:31 np0005540827.novalocal systemd[4302]: Reached target Main User Target.
Dec 01 09:09:31 np0005540827.novalocal systemd[4302]: Startup finished in 100ms.
Dec 01 09:09:31 np0005540827.novalocal systemd[1]: Started User Manager for UID 1000.
Dec 01 09:09:31 np0005540827.novalocal systemd[1]: Started Session 1 of User zuul.
Dec 01 09:09:31 np0005540827.novalocal sshd-session[4298]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:09:32 np0005540827.novalocal python3[4384]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:09:36 np0005540827.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 01 09:09:36 np0005540827.novalocal python3[4412]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:09:44 np0005540827.novalocal python3[4472]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:09:45 np0005540827.novalocal python3[4512]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Dec 01 09:09:47 np0005540827.novalocal python3[4538]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCs83Me/XJ93JONH+A3ys3BwT4zj02WAeI+PLa+4ictmx5jo+8RBm+8bQesnDGHtSEP3xHjam8Fwfo48sUz5kG1CEXeLWH7xBEXZQ+pidesIq17dWuB2YicfBCHGhZlqb9l/fISdA7PnN5BsCCyr5hQUlvwUPLq0dzE02EgJGcgUqI2ytoS8AvmZ5RX7c4IqGNOi3dFOny3uCDUlNZf/m10t5Eqaq53DNvn55ZT7HmuZuq1QSut2qopHMOrbqUIx17TPb+KiAJG5h8+CV0pJKLq1fSsJaTqR/MZTXsPF5oJHMT5BqnKmRCBNJyY+ko1jZA3a2jF3MqcxIxwgndHOIWitGlByPkFLlWfLV78+yskN9w1nWzxFvEhkCexTCcqU8TmYGBBjKU4l0icf9POdHjr9cZVQmRYdIveeEtZJS0R8S9Tx1uYEuLAXYurVEYBQXuNDw4iQV4pSabQVesX8t9KwUTkxMg2kUXIjvBcHSEiT6wtG+W/j0byNv0sj6FU2EM= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 09:09:47 np0005540827.novalocal python3[4562]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:09:48 np0005540827.novalocal python3[4661]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 09:09:48 np0005540827.novalocal python3[4732]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764580188.155165-254-138612375389639/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=0c4295e5299f438c97cd17e88c30c039_id_rsa follow=False checksum=c0f0a3fd8bd6e06ffcd4372a522626913bfa295a backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:09:49 np0005540827.novalocal python3[4855]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 09:09:49 np0005540827.novalocal python3[4926]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764580188.977265-309-227908802802880/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=0c4295e5299f438c97cd17e88c30c039_id_rsa.pub follow=False checksum=0bbaabac56f17c62b907e9f050ef8c82d5faceb9 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:09:51 np0005540827.novalocal python3[4974]: ansible-ping Invoked with data=pong
Dec 01 09:09:52 np0005540827.novalocal python3[4998]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:09:54 np0005540827.novalocal python3[5056]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Dec 01 09:09:55 np0005540827.novalocal python3[5088]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:09:56 np0005540827.novalocal python3[5112]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:09:56 np0005540827.novalocal python3[5136]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:09:56 np0005540827.novalocal python3[5160]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:09:56 np0005540827.novalocal python3[5184]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:09:57 np0005540827.novalocal python3[5208]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:09:58 np0005540827.novalocal sudo[5232]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmpgoocdvobuwnnhbxqfzgetjxdjhbuh ; /usr/bin/python3'
Dec 01 09:09:58 np0005540827.novalocal sudo[5232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:58 np0005540827.novalocal python3[5234]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:09:58 np0005540827.novalocal sudo[5232]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:59 np0005540827.novalocal sudo[5310]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdnxcqkpyrrneaeoqkrncomhxowslqfb ; /usr/bin/python3'
Dec 01 09:09:59 np0005540827.novalocal sudo[5310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:09:59 np0005540827.novalocal python3[5312]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 09:09:59 np0005540827.novalocal sudo[5310]: pam_unix(sudo:session): session closed for user root
Dec 01 09:09:59 np0005540827.novalocal sudo[5383]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuxwndlsiabrdzlrahuickiqrnmjyaiy ; /usr/bin/python3'
Dec 01 09:09:59 np0005540827.novalocal sudo[5383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:00 np0005540827.novalocal python3[5385]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580199.1842887-35-111050150736677/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:00 np0005540827.novalocal sudo[5383]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:00 np0005540827.novalocal python3[5433]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 09:10:00 np0005540827.novalocal python3[5457]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 09:10:01 np0005540827.novalocal python3[5481]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 09:10:01 np0005540827.novalocal python3[5505]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 09:10:01 np0005540827.novalocal python3[5529]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 09:10:02 np0005540827.novalocal python3[5553]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 09:10:02 np0005540827.novalocal python3[5577]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 09:10:02 np0005540827.novalocal python3[5601]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 09:10:02 np0005540827.novalocal python3[5625]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 09:10:03 np0005540827.novalocal python3[5649]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 09:10:03 np0005540827.novalocal python3[5673]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 09:10:03 np0005540827.novalocal python3[5697]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 09:10:03 np0005540827.novalocal python3[5721]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 09:10:04 np0005540827.novalocal python3[5745]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 09:10:04 np0005540827.novalocal python3[5769]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 09:10:04 np0005540827.novalocal python3[5793]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 09:10:05 np0005540827.novalocal python3[5817]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 09:10:05 np0005540827.novalocal python3[5841]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 09:10:05 np0005540827.novalocal python3[5865]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 09:10:05 np0005540827.novalocal python3[5889]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 09:10:06 np0005540827.novalocal python3[5913]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 09:10:06 np0005540827.novalocal python3[5937]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 09:10:06 np0005540827.novalocal python3[5961]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 09:10:07 np0005540827.novalocal python3[5985]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 09:10:07 np0005540827.novalocal python3[6009]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 09:10:07 np0005540827.novalocal python3[6033]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 09:10:10 np0005540827.novalocal sudo[6057]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrlfgigcbexgjspkqpmrzulxgiikehrv ; /usr/bin/python3'
Dec 01 09:10:10 np0005540827.novalocal sudo[6057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:11 np0005540827.novalocal python3[6059]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 01 09:10:11 np0005540827.novalocal systemd[1]: Starting Time & Date Service...
Dec 01 09:10:11 np0005540827.novalocal systemd[1]: Started Time & Date Service.
Dec 01 09:10:11 np0005540827.novalocal systemd-timedated[6061]: Changed time zone to 'UTC' (UTC).
Dec 01 09:10:11 np0005540827.novalocal sudo[6057]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:11 np0005540827.novalocal sudo[6088]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrkyuphqwuqvhpnrknnmhesigepbtyvn ; /usr/bin/python3'
Dec 01 09:10:11 np0005540827.novalocal sudo[6088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:11 np0005540827.novalocal python3[6090]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:11 np0005540827.novalocal sudo[6088]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:12 np0005540827.novalocal python3[6166]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 09:10:12 np0005540827.novalocal python3[6237]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764580211.8191283-254-174806744599992/source _original_basename=tmpj6smjkn4 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:12 np0005540827.novalocal python3[6337]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 09:10:13 np0005540827.novalocal python3[6408]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764580212.7213066-304-264862167355596/source _original_basename=tmpoh0me2_k follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:14 np0005540827.novalocal sudo[6508]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yokempsbmzohuyxtcgrguqrofwuukojy ; /usr/bin/python3'
Dec 01 09:10:14 np0005540827.novalocal sudo[6508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:14 np0005540827.novalocal python3[6510]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 09:10:14 np0005540827.novalocal sudo[6508]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:14 np0005540827.novalocal sudo[6581]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mekppwpcngsjwgtlwdywdoxmuchpsldu ; /usr/bin/python3'
Dec 01 09:10:14 np0005540827.novalocal sudo[6581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:14 np0005540827.novalocal python3[6583]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764580213.8819036-384-85982835564823/source _original_basename=tmpn_uy5hkt follow=False checksum=958c7c038fe74051d420f8f1aa402f4dafe9a187 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:14 np0005540827.novalocal sudo[6581]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:15 np0005540827.novalocal python3[6631]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:10:15 np0005540827.novalocal python3[6657]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:10:15 np0005540827.novalocal sudo[6735]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wogpxrllrytsyivvevhvbksrmpeahtnd ; /usr/bin/python3'
Dec 01 09:10:15 np0005540827.novalocal sudo[6735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:16 np0005540827.novalocal python3[6737]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 09:10:16 np0005540827.novalocal sudo[6735]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:16 np0005540827.novalocal sudo[6808]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hthendnrhmmvzuftoxvjxynlpohxwcgs ; /usr/bin/python3'
Dec 01 09:10:16 np0005540827.novalocal sudo[6808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:16 np0005540827.novalocal python3[6810]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580215.527902-454-11500478457558/source _original_basename=tmpbc11m6b5 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:16 np0005540827.novalocal sudo[6808]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:16 np0005540827.novalocal sudo[6859]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oatkmvodmxwjmmoizsbmmgihqgnzxfua ; /usr/bin/python3'
Dec 01 09:10:16 np0005540827.novalocal sudo[6859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:17 np0005540827.novalocal python3[6861]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163efc-24cc-bfee-2c1a-00000000001f-1-compute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:10:17 np0005540827.novalocal sudo[6859]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:17 np0005540827.novalocal chronyd[788]: Selected source 216.197.228.230 (2.centos.pool.ntp.org)
Dec 01 09:10:17 np0005540827.novalocal python3[6889]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163efc-24cc-bfee-2c1a-000000000020-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Dec 01 09:10:19 np0005540827.novalocal python3[6918]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:31 np0005540827.novalocal sshd-session[6919]: Received disconnect from 14.22.89.30 port 45534:11: Bye Bye [preauth]
Dec 01 09:10:31 np0005540827.novalocal sshd-session[6919]: Disconnected from authenticating user root 14.22.89.30 port 45534 [preauth]
Dec 01 09:10:40 np0005540827.novalocal sudo[6944]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amataxfmxocnfadncdcldxjebhvekhby ; /usr/bin/python3'
Dec 01 09:10:40 np0005540827.novalocal sudo[6944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:10:40 np0005540827.novalocal python3[6946]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:10:40 np0005540827.novalocal sudo[6944]: pam_unix(sudo:session): session closed for user root
Dec 01 09:10:41 np0005540827.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 01 09:11:23 np0005540827.novalocal chronyd[788]: Selected source 174.138.193.90 (2.centos.pool.ntp.org)
Dec 01 09:11:40 np0005540827.novalocal sshd-session[4311]: Received disconnect from 38.102.83.114 port 50350:11: disconnected by user
Dec 01 09:11:40 np0005540827.novalocal sshd-session[4311]: Disconnected from user zuul 38.102.83.114 port 50350
Dec 01 09:11:40 np0005540827.novalocal sshd-session[4298]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:11:40 np0005540827.novalocal systemd-logind[795]: Session 1 logged out. Waiting for processes to exit.
Dec 01 09:11:40 np0005540827.novalocal sshd-session[6949]: Received disconnect from 102.213.183.66 port 59420:11: Bye Bye [preauth]
Dec 01 09:11:40 np0005540827.novalocal sshd-session[6949]: Disconnected from authenticating user root 102.213.183.66 port 59420 [preauth]
Dec 01 09:11:42 np0005540827.novalocal systemd[4302]: Starting Mark boot as successful...
Dec 01 09:11:42 np0005540827.novalocal systemd[4302]: Finished Mark boot as successful.
Dec 01 09:12:10 np0005540827.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec 01 09:12:10 np0005540827.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Dec 01 09:12:10 np0005540827.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Dec 01 09:12:10 np0005540827.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Dec 01 09:12:10 np0005540827.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Dec 01 09:12:10 np0005540827.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Dec 01 09:12:10 np0005540827.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Dec 01 09:12:10 np0005540827.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Dec 01 09:12:10 np0005540827.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Dec 01 09:12:10 np0005540827.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Dec 01 09:12:10 np0005540827.novalocal NetworkManager[860]: <info>  [1764580330.6166] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 01 09:12:10 np0005540827.novalocal systemd-udevd[6952]: Network interface NamePolicy= disabled on kernel command line.
Dec 01 09:12:10 np0005540827.novalocal NetworkManager[860]: <info>  [1764580330.6356] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 01 09:12:10 np0005540827.novalocal NetworkManager[860]: <info>  [1764580330.6380] settings: (eth1): created default wired connection 'Wired connection 1'
Dec 01 09:12:10 np0005540827.novalocal NetworkManager[860]: <info>  [1764580330.6383] device (eth1): carrier: link connected
Dec 01 09:12:10 np0005540827.novalocal NetworkManager[860]: <info>  [1764580330.6384] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec 01 09:12:10 np0005540827.novalocal NetworkManager[860]: <info>  [1764580330.6389] policy: auto-activating connection 'Wired connection 1' (094bb5e5-ea9a-3656-b952-5d4c84d86268)
Dec 01 09:12:10 np0005540827.novalocal NetworkManager[860]: <info>  [1764580330.6392] device (eth1): Activation: starting connection 'Wired connection 1' (094bb5e5-ea9a-3656-b952-5d4c84d86268)
Dec 01 09:12:10 np0005540827.novalocal NetworkManager[860]: <info>  [1764580330.6393] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 09:12:10 np0005540827.novalocal NetworkManager[860]: <info>  [1764580330.6396] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 09:12:10 np0005540827.novalocal NetworkManager[860]: <info>  [1764580330.6399] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 09:12:10 np0005540827.novalocal NetworkManager[860]: <info>  [1764580330.6402] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 01 09:12:11 np0005540827.novalocal sshd-session[6956]: Accepted publickey for zuul from 38.102.83.114 port 52198 ssh2: RSA SHA256:A8KzWK46IZ9u9VeBeLMGXVv9yesAJ5sUIau6zdZZ9P8
Dec 01 09:12:11 np0005540827.novalocal systemd-logind[795]: New session 3 of user zuul.
Dec 01 09:12:11 np0005540827.novalocal systemd[1]: Started Session 3 of User zuul.
Dec 01 09:12:11 np0005540827.novalocal sshd-session[6956]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:12:12 np0005540827.novalocal python3[6983]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163efc-24cc-4d84-78d9-0000000001f6-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:12:22 np0005540827.novalocal sudo[7061]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnxehatoxoedsrfloqsbtssqbyxxgejl ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 01 09:12:22 np0005540827.novalocal sudo[7061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:12:22 np0005540827.novalocal python3[7063]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 09:12:22 np0005540827.novalocal sudo[7061]: pam_unix(sudo:session): session closed for user root
Dec 01 09:12:22 np0005540827.novalocal sudo[7134]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwxbiyswdkqzyewrfkiyjejmbqwwjyte ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 01 09:12:22 np0005540827.novalocal sudo[7134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:12:22 np0005540827.novalocal python3[7136]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764580341.982572-206-98903063548087/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=83626592cd5ab41e6130fd1a62b51a677a0d44a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:12:22 np0005540827.novalocal sudo[7134]: pam_unix(sudo:session): session closed for user root
Dec 01 09:12:22 np0005540827.novalocal sudo[7184]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrchsngdsdurybvxjrzqexapvvfykhsg ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 01 09:12:22 np0005540827.novalocal sudo[7184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:12:23 np0005540827.novalocal python3[7186]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 09:12:23 np0005540827.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 01 09:12:23 np0005540827.novalocal systemd[1]: Stopped Network Manager Wait Online.
Dec 01 09:12:23 np0005540827.novalocal systemd[1]: Stopping Network Manager Wait Online...
Dec 01 09:12:23 np0005540827.novalocal systemd[1]: Stopping Network Manager...
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[860]: <info>  [1764580343.2772] caught SIGTERM, shutting down normally.
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[860]: <info>  [1764580343.2789] dhcp4 (eth0): canceled DHCP transaction
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[860]: <info>  [1764580343.2789] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[860]: <info>  [1764580343.2790] dhcp4 (eth0): state changed no lease
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[860]: <info>  [1764580343.2794] manager: NetworkManager state is now CONNECTING
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[860]: <info>  [1764580343.2846] dhcp4 (eth1): canceled DHCP transaction
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[860]: <info>  [1764580343.2847] dhcp4 (eth1): state changed no lease
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[860]: <info>  [1764580343.2925] exiting (success)
Dec 01 09:12:23 np0005540827.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 01 09:12:23 np0005540827.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 01 09:12:23 np0005540827.novalocal systemd[1]: Stopped Network Manager.
Dec 01 09:12:23 np0005540827.novalocal systemd[1]: NetworkManager.service: Consumed 1.177s CPU time, 10.0M memory peak.
Dec 01 09:12:23 np0005540827.novalocal systemd[1]: Starting Network Manager...
Dec 01 09:12:23 np0005540827.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.3525] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:b3cb21dd-233c-423c-aa19-329645e7ae96)
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.3526] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.3610] manager[0x55dbbbdfd070]: monitoring kernel firmware directory '/lib/firmware'.
Dec 01 09:12:23 np0005540827.novalocal systemd[1]: Starting Hostname Service...
Dec 01 09:12:23 np0005540827.novalocal systemd[1]: Started Hostname Service.
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4651] hostname: hostname: using hostnamed
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4652] hostname: static hostname changed from (none) to "np0005540827.novalocal"
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4657] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4663] manager[0x55dbbbdfd070]: rfkill: Wi-Fi hardware radio set enabled
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4663] manager[0x55dbbbdfd070]: rfkill: WWAN hardware radio set enabled
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4693] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4694] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4694] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4695] manager: Networking is enabled by state file
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4697] settings: Loaded settings plugin: keyfile (internal)
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4702] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4729] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4738] dhcp: init: Using DHCP client 'internal'
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4741] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4745] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4750] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4758] device (lo): Activation: starting connection 'lo' (85edc75a-527c-4c5c-9e5c-ea0fbf93ba32)
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4765] device (eth0): carrier: link connected
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4769] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4774] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4774] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4780] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4787] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4793] device (eth1): carrier: link connected
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4798] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4803] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (094bb5e5-ea9a-3656-b952-5d4c84d86268) (indicated)
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4803] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4808] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4815] device (eth1): Activation: starting connection 'Wired connection 1' (094bb5e5-ea9a-3656-b952-5d4c84d86268)
Dec 01 09:12:23 np0005540827.novalocal systemd[1]: Started Network Manager.
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4823] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4828] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4830] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4832] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4835] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4839] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4842] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4847] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4850] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4860] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4864] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4874] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4878] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4903] dhcp4 (eth0): state changed new lease, address=38.102.83.236
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4909] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4988] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4995] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.4998] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 01 09:12:23 np0005540827.novalocal systemd[1]: Starting Network Manager Wait Online...
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.5003] device (lo): Activation: successful, device activated.
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.5032] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.5034] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.5038] manager: NetworkManager state is now CONNECTED_SITE
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.5041] device (eth0): Activation: successful, device activated.
Dec 01 09:12:23 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580343.5049] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 01 09:12:23 np0005540827.novalocal sudo[7184]: pam_unix(sudo:session): session closed for user root
Dec 01 09:12:23 np0005540827.novalocal python3[7270]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163efc-24cc-4d84-78d9-0000000000d3-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:12:33 np0005540827.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 01 09:12:53 np0005540827.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 01 09:13:09 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580389.1156] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 01 09:13:09 np0005540827.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 01 09:13:09 np0005540827.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 01 09:13:09 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580389.1518] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 01 09:13:09 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580389.1521] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 01 09:13:09 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580389.1527] device (eth1): Activation: successful, device activated.
Dec 01 09:13:09 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580389.1533] manager: startup complete
Dec 01 09:13:09 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580389.1534] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Dec 01 09:13:09 np0005540827.novalocal NetworkManager[7192]: <warn>  [1764580389.1540] device (eth1): Activation: failed for connection 'Wired connection 1'
Dec 01 09:13:09 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580389.1547] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Dec 01 09:13:09 np0005540827.novalocal systemd[1]: Finished Network Manager Wait Online.
Dec 01 09:13:09 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580389.1616] dhcp4 (eth1): canceled DHCP transaction
Dec 01 09:13:09 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580389.1616] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 01 09:13:09 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580389.1617] dhcp4 (eth1): state changed no lease
Dec 01 09:13:09 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580389.1630] policy: auto-activating connection 'ci-private-network' (6f5dc6d4-dddd-5827-b41a-7cbbcd00f91f)
Dec 01 09:13:09 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580389.1634] device (eth1): Activation: starting connection 'ci-private-network' (6f5dc6d4-dddd-5827-b41a-7cbbcd00f91f)
Dec 01 09:13:09 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580389.1635] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 09:13:09 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580389.1638] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 09:13:09 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580389.1644] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 09:13:09 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580389.1652] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 01 09:13:09 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580389.1692] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 01 09:13:09 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580389.1694] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 01 09:13:09 np0005540827.novalocal NetworkManager[7192]: <info>  [1764580389.1701] device (eth1): Activation: successful, device activated.
Dec 01 09:13:19 np0005540827.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 01 09:13:23 np0005540827.novalocal sshd-session[6959]: Received disconnect from 38.102.83.114 port 52198:11: disconnected by user
Dec 01 09:13:23 np0005540827.novalocal sshd-session[6959]: Disconnected from user zuul 38.102.83.114 port 52198
Dec 01 09:13:23 np0005540827.novalocal sshd-session[6956]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:13:23 np0005540827.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Dec 01 09:13:23 np0005540827.novalocal systemd[1]: session-3.scope: Consumed 1.718s CPU time.
Dec 01 09:13:23 np0005540827.novalocal systemd-logind[795]: Session 3 logged out. Waiting for processes to exit.
Dec 01 09:13:23 np0005540827.novalocal systemd-logind[795]: Removed session 3.
Dec 01 09:13:35 np0005540827.novalocal sshd-session[7298]: Accepted publickey for zuul from 38.102.83.114 port 45702 ssh2: RSA SHA256:A8KzWK46IZ9u9VeBeLMGXVv9yesAJ5sUIau6zdZZ9P8
Dec 01 09:13:35 np0005540827.novalocal systemd-logind[795]: New session 4 of user zuul.
Dec 01 09:13:35 np0005540827.novalocal systemd[1]: Started Session 4 of User zuul.
Dec 01 09:13:35 np0005540827.novalocal sshd-session[7298]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:13:35 np0005540827.novalocal sudo[7377]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvtbjedjzfysqayafgjenawnrcsdunpe ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 01 09:13:35 np0005540827.novalocal sudo[7377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:13:35 np0005540827.novalocal python3[7379]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 09:13:35 np0005540827.novalocal sudo[7377]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:35 np0005540827.novalocal sudo[7450]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-giwrodjidtqirehyfizinkvnzhhkncvz ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 01 09:13:35 np0005540827.novalocal sudo[7450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:13:35 np0005540827.novalocal python3[7452]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580415.1467483-373-205250554000965/source _original_basename=tmpqze8o975 follow=False checksum=978dba8c6f7bc0ac5b14f81009c6504f60a75fb7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:13:35 np0005540827.novalocal sudo[7450]: pam_unix(sudo:session): session closed for user root
Dec 01 09:13:38 np0005540827.novalocal sshd-session[7301]: Connection closed by 38.102.83.114 port 45702
Dec 01 09:13:38 np0005540827.novalocal sshd-session[7298]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:13:38 np0005540827.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Dec 01 09:13:38 np0005540827.novalocal systemd-logind[795]: Session 4 logged out. Waiting for processes to exit.
Dec 01 09:13:38 np0005540827.novalocal systemd-logind[795]: Removed session 4.
Dec 01 09:13:52 np0005540827.novalocal sshd-session[7477]: Invalid user mike from 45.78.219.119 port 48462
Dec 01 09:13:52 np0005540827.novalocal sshd-session[7477]: Received disconnect from 45.78.219.119 port 48462:11: Bye Bye [preauth]
Dec 01 09:13:52 np0005540827.novalocal sshd-session[7477]: Disconnected from invalid user mike 45.78.219.119 port 48462 [preauth]
Dec 01 09:14:09 np0005540827.novalocal sshd-session[7479]: Invalid user soporte from 102.213.183.66 port 60632
Dec 01 09:14:09 np0005540827.novalocal sshd-session[7479]: Received disconnect from 102.213.183.66 port 60632:11: Bye Bye [preauth]
Dec 01 09:14:09 np0005540827.novalocal sshd-session[7479]: Disconnected from invalid user soporte 102.213.183.66 port 60632 [preauth]
Dec 01 09:14:42 np0005540827.novalocal systemd[4302]: Created slice User Background Tasks Slice.
Dec 01 09:14:42 np0005540827.novalocal systemd[4302]: Starting Cleanup of User's Temporary Files and Directories...
Dec 01 09:14:42 np0005540827.novalocal systemd[4302]: Finished Cleanup of User's Temporary Files and Directories.
Dec 01 09:15:27 np0005540827.novalocal sshd-session[7488]: Received disconnect from 102.213.183.66 port 45886:11: Bye Bye [preauth]
Dec 01 09:15:27 np0005540827.novalocal sshd-session[7488]: Disconnected from authenticating user root 102.213.183.66 port 45886 [preauth]
Dec 01 09:16:29 np0005540827.novalocal sshd-session[7491]: Invalid user testuser from 14.22.89.30 port 44456
Dec 01 09:16:30 np0005540827.novalocal sshd-session[7491]: Received disconnect from 14.22.89.30 port 44456:11: Bye Bye [preauth]
Dec 01 09:16:30 np0005540827.novalocal sshd-session[7491]: Disconnected from invalid user testuser 14.22.89.30 port 44456 [preauth]
Dec 01 09:16:36 np0005540827.novalocal sshd-session[7490]: Received disconnect from 222.73.135.87 port 49146:11: Bye Bye [preauth]
Dec 01 09:16:36 np0005540827.novalocal sshd-session[7490]: Disconnected from authenticating user root 222.73.135.87 port 49146 [preauth]
Dec 01 09:16:45 np0005540827.novalocal sshd-session[7496]: Invalid user sonarqube from 102.213.183.66 port 34194
Dec 01 09:16:45 np0005540827.novalocal sshd-session[7496]: Received disconnect from 102.213.183.66 port 34194:11: Bye Bye [preauth]
Dec 01 09:16:45 np0005540827.novalocal sshd-session[7496]: Disconnected from invalid user sonarqube 102.213.183.66 port 34194 [preauth]
Dec 01 09:16:50 np0005540827.novalocal sshd-session[7494]: Received disconnect from 45.78.219.119 port 57624:11: Bye Bye [preauth]
Dec 01 09:16:50 np0005540827.novalocal sshd-session[7494]: Disconnected from authenticating user root 45.78.219.119 port 57624 [preauth]
Dec 01 09:16:55 np0005540827.novalocal sshd[1008]: Timeout before authentication for connection from 14.22.89.30 to 38.102.83.236, pid = 7484
Dec 01 09:16:58 np0005540827.novalocal sshd[1008]: Timeout before authentication for connection from 14.103.123.167 to 38.102.83.236, pid = 7485
Dec 01 09:17:57 np0005540827.novalocal sshd[1008]: drop connection #1 from [14.22.89.30]:47804 on [38.102.83.236]:22 penalty: exceeded LoginGraceTime
Dec 01 09:17:59 np0005540827.novalocal sshd-session[7500]: Invalid user master from 102.213.183.66 port 40400
Dec 01 09:17:59 np0005540827.novalocal sshd-session[7500]: Received disconnect from 102.213.183.66 port 40400:11: Bye Bye [preauth]
Dec 01 09:17:59 np0005540827.novalocal sshd-session[7500]: Disconnected from invalid user master 102.213.183.66 port 40400 [preauth]
Dec 01 09:18:51 np0005540827.novalocal sshd[1008]: Timeout before authentication for connection from 120.48.35.181 to 38.102.83.236, pid = 7498
Dec 01 09:19:12 np0005540827.novalocal sshd-session[7502]: Received disconnect from 102.213.183.66 port 40042:11: Bye Bye [preauth]
Dec 01 09:19:12 np0005540827.novalocal sshd-session[7502]: Disconnected from authenticating user root 102.213.183.66 port 40042 [preauth]
Dec 01 09:19:32 np0005540827.novalocal sshd-session[7506]: Accepted publickey for zuul from 38.102.83.114 port 37662 ssh2: RSA SHA256:A8KzWK46IZ9u9VeBeLMGXVv9yesAJ5sUIau6zdZZ9P8
Dec 01 09:19:32 np0005540827.novalocal systemd-logind[795]: New session 5 of user zuul.
Dec 01 09:19:32 np0005540827.novalocal systemd[1]: Started Session 5 of User zuul.
Dec 01 09:19:32 np0005540827.novalocal sshd-session[7506]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:19:32 np0005540827.novalocal sudo[7533]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmnwqcedqlxccqhdodourrikyqkpvtqm ; /usr/bin/python3'
Dec 01 09:19:32 np0005540827.novalocal sudo[7533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:32 np0005540827.novalocal python3[7535]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda _uses_shell=True zuul_log_id=fa163efc-24cc-6b80-459f-000000001cdc-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:19:32 np0005540827.novalocal sudo[7533]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:33 np0005540827.novalocal sudo[7562]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbxpnxguvkjavlismrpxwrvtablkmenr ; /usr/bin/python3'
Dec 01 09:19:33 np0005540827.novalocal sudo[7562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:33 np0005540827.novalocal python3[7564]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:19:33 np0005540827.novalocal sudo[7562]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:33 np0005540827.novalocal sudo[7588]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghzordwcjsydptnnwcraxcgmgpmotepg ; /usr/bin/python3'
Dec 01 09:19:33 np0005540827.novalocal sudo[7588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:34 np0005540827.novalocal python3[7590]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:19:34 np0005540827.novalocal sudo[7588]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:34 np0005540827.novalocal sudo[7614]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjgbawuswimxxcmcqzidzafsgeeajutj ; /usr/bin/python3'
Dec 01 09:19:34 np0005540827.novalocal sudo[7614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:34 np0005540827.novalocal python3[7616]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:19:34 np0005540827.novalocal sudo[7614]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:35 np0005540827.novalocal sudo[7640]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqvnybevtpzqopmyokdxmujhhmcadezj ; /usr/bin/python3'
Dec 01 09:19:35 np0005540827.novalocal sudo[7640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:35 np0005540827.novalocal python3[7642]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:19:35 np0005540827.novalocal sudo[7640]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:35 np0005540827.novalocal sudo[7666]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvjvvmvhbwskgvwukpvrlerlfbjtvstc ; /usr/bin/python3'
Dec 01 09:19:35 np0005540827.novalocal sudo[7666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:35 np0005540827.novalocal python3[7668]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:19:35 np0005540827.novalocal sudo[7666]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:35 np0005540827.novalocal sudo[7744]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fujrztebvplkxohmvqhxginjrkuwheuk ; /usr/bin/python3'
Dec 01 09:19:35 np0005540827.novalocal sudo[7744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:36 np0005540827.novalocal python3[7746]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 09:19:36 np0005540827.novalocal sudo[7744]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:37 np0005540827.novalocal sudo[7817]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbxndtadvbkioisomcejbxqzkliprfta ; /usr/bin/python3'
Dec 01 09:19:37 np0005540827.novalocal sudo[7817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:37 np0005540827.novalocal python3[7819]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580775.770414-519-147553759139448/source _original_basename=tmp63qs6yud follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:19:37 np0005540827.novalocal sudo[7817]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:37 np0005540827.novalocal sudo[7867]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqesjoibgaxnxajaoujokilqogafolfo ; /usr/bin/python3'
Dec 01 09:19:37 np0005540827.novalocal sudo[7867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:38 np0005540827.novalocal python3[7869]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 01 09:19:38 np0005540827.novalocal systemd[1]: Reloading.
Dec 01 09:19:38 np0005540827.novalocal systemd-rc-local-generator[7889]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:19:38 np0005540827.novalocal sudo[7867]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:39 np0005540827.novalocal sudo[7922]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atwtseesdfcjkidrvdvydvjclwaleomb ; /usr/bin/python3'
Dec 01 09:19:39 np0005540827.novalocal sudo[7922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:39 np0005540827.novalocal python3[7924]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Dec 01 09:19:39 np0005540827.novalocal sudo[7922]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:40 np0005540827.novalocal sudo[7948]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqwvrhehkqfuhsrjhtrjhwbgwfshhvuv ; /usr/bin/python3'
Dec 01 09:19:40 np0005540827.novalocal sudo[7948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:40 np0005540827.novalocal python3[7950]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:19:40 np0005540827.novalocal sudo[7948]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:40 np0005540827.novalocal sudo[7976]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jchwoguekvrabfqpgmybuogoomlwwnka ; /usr/bin/python3'
Dec 01 09:19:40 np0005540827.novalocal sudo[7976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:40 np0005540827.novalocal python3[7978]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:19:40 np0005540827.novalocal sudo[7976]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:40 np0005540827.novalocal sudo[8004]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jblccuwshsfeigmccwoxcrabkmdkwxjt ; /usr/bin/python3'
Dec 01 09:19:40 np0005540827.novalocal sudo[8004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:40 np0005540827.novalocal python3[8006]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:19:40 np0005540827.novalocal sudo[8004]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:41 np0005540827.novalocal sudo[8032]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqygcwlvmdnjinttzwpvopncxbkjyadc ; /usr/bin/python3'
Dec 01 09:19:41 np0005540827.novalocal sudo[8032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:41 np0005540827.novalocal python3[8034]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:19:41 np0005540827.novalocal sudo[8032]: pam_unix(sudo:session): session closed for user root
Dec 01 09:19:41 np0005540827.novalocal python3[8061]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163efc-24cc-6b80-459f-000000001ce3-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:19:42 np0005540827.novalocal python3[8091]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 01 09:19:44 np0005540827.novalocal sshd-session[7509]: Connection closed by 38.102.83.114 port 37662
Dec 01 09:19:44 np0005540827.novalocal sshd-session[7506]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:19:44 np0005540827.novalocal systemd-logind[795]: Session 5 logged out. Waiting for processes to exit.
Dec 01 09:19:44 np0005540827.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Dec 01 09:19:44 np0005540827.novalocal systemd[1]: session-5.scope: Consumed 4.368s CPU time.
Dec 01 09:19:44 np0005540827.novalocal systemd-logind[795]: Removed session 5.
Dec 01 09:19:46 np0005540827.novalocal sshd-session[8098]: Accepted publickey for zuul from 38.102.83.114 port 38064 ssh2: RSA SHA256:A8KzWK46IZ9u9VeBeLMGXVv9yesAJ5sUIau6zdZZ9P8
Dec 01 09:19:46 np0005540827.novalocal systemd-logind[795]: New session 6 of user zuul.
Dec 01 09:19:46 np0005540827.novalocal systemd[1]: Started Session 6 of User zuul.
Dec 01 09:19:46 np0005540827.novalocal sshd-session[8098]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:19:46 np0005540827.novalocal sudo[8125]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jywrnbnwvxwqwwevsfbgzdczrkvdvuon ; /usr/bin/python3'
Dec 01 09:19:46 np0005540827.novalocal sudo[8125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:19:46 np0005540827.novalocal python3[8127]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 01 09:19:51 np0005540827.novalocal sshd-session[8097]: Invalid user bitrix from 45.78.219.119 port 38478
Dec 01 09:19:53 np0005540827.novalocal sshd-session[8097]: Received disconnect from 45.78.219.119 port 38478:11: Bye Bye [preauth]
Dec 01 09:19:53 np0005540827.novalocal sshd-session[8097]: Disconnected from invalid user bitrix 45.78.219.119 port 38478 [preauth]
Dec 01 09:20:01 np0005540827.novalocal kernel: SELinux:  Converting 385 SID table entries...
Dec 01 09:20:01 np0005540827.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 01 09:20:01 np0005540827.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 01 09:20:01 np0005540827.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 01 09:20:01 np0005540827.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 01 09:20:01 np0005540827.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 01 09:20:01 np0005540827.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 01 09:20:01 np0005540827.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 01 09:20:10 np0005540827.novalocal kernel: SELinux:  Converting 385 SID table entries...
Dec 01 09:20:10 np0005540827.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 01 09:20:10 np0005540827.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 01 09:20:10 np0005540827.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 01 09:20:10 np0005540827.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 01 09:20:10 np0005540827.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 01 09:20:10 np0005540827.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 01 09:20:10 np0005540827.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 01 09:20:19 np0005540827.novalocal kernel: SELinux:  Converting 385 SID table entries...
Dec 01 09:20:19 np0005540827.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 01 09:20:19 np0005540827.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 01 09:20:19 np0005540827.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 01 09:20:19 np0005540827.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 01 09:20:19 np0005540827.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 01 09:20:19 np0005540827.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 01 09:20:19 np0005540827.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 01 09:20:20 np0005540827.novalocal setsebool[8198]: The virt_use_nfs policy boolean was changed to 1 by root
Dec 01 09:20:20 np0005540827.novalocal setsebool[8198]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Dec 01 09:20:22 np0005540827.novalocal sshd-session[8204]: Invalid user pivpn from 102.213.183.66 port 51756
Dec 01 09:20:22 np0005540827.novalocal sshd-session[8204]: Received disconnect from 102.213.183.66 port 51756:11: Bye Bye [preauth]
Dec 01 09:20:22 np0005540827.novalocal sshd-session[8204]: Disconnected from invalid user pivpn 102.213.183.66 port 51756 [preauth]
Dec 01 09:20:30 np0005540827.novalocal kernel: SELinux:  Converting 388 SID table entries...
Dec 01 09:20:30 np0005540827.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 01 09:20:30 np0005540827.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 01 09:20:30 np0005540827.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 01 09:20:30 np0005540827.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 01 09:20:30 np0005540827.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 01 09:20:30 np0005540827.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 01 09:20:30 np0005540827.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 01 09:20:48 np0005540827.novalocal dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec 01 09:20:48 np0005540827.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 01 09:20:48 np0005540827.novalocal systemd[1]: Starting man-db-cache-update.service...
Dec 01 09:20:48 np0005540827.novalocal systemd[1]: Reloading.
Dec 01 09:20:48 np0005540827.novalocal systemd-rc-local-generator[8957]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:20:48 np0005540827.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Dec 01 09:20:51 np0005540827.novalocal sudo[8125]: pam_unix(sudo:session): session closed for user root
Dec 01 09:20:56 np0005540827.novalocal irqbalance[789]: Cannot change IRQ 27 affinity: Operation not permitted
Dec 01 09:20:56 np0005540827.novalocal irqbalance[789]: IRQ 27 affinity is now unmanaged
Dec 01 09:21:00 np0005540827.novalocal python3[15557]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163efc-24cc-8f55-108f-00000000000c-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:21:01 np0005540827.novalocal kernel: evm: overlay not supported
Dec 01 09:21:01 np0005540827.novalocal systemd[4302]: Starting D-Bus User Message Bus...
Dec 01 09:21:01 np0005540827.novalocal dbus-broker-launch[16118]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec 01 09:21:01 np0005540827.novalocal dbus-broker-launch[16118]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec 01 09:21:01 np0005540827.novalocal systemd[4302]: Started D-Bus User Message Bus.
Dec 01 09:21:01 np0005540827.novalocal dbus-broker-lau[16118]: Ready
Dec 01 09:21:01 np0005540827.novalocal systemd[4302]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec 01 09:21:01 np0005540827.novalocal systemd[4302]: Created slice Slice /user.
Dec 01 09:21:01 np0005540827.novalocal systemd[4302]: podman-16034.scope: unit configures an IP firewall, but not running as root.
Dec 01 09:21:01 np0005540827.novalocal systemd[4302]: (This warning is only shown for the first unit using IP firewalling.)
Dec 01 09:21:01 np0005540827.novalocal systemd[4302]: Started podman-16034.scope.
Dec 01 09:21:01 np0005540827.novalocal systemd[4302]: Started podman-pause-89fb9cfd.scope.
Dec 01 09:21:02 np0005540827.novalocal sudo[16395]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmihopnijvcochxpcjbgvumkrpyeaaer ; /usr/bin/python3'
Dec 01 09:21:02 np0005540827.novalocal sudo[16395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:02 np0005540827.novalocal python3[16404]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.51:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.51:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:21:02 np0005540827.novalocal python3[16404]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Dec 01 09:21:02 np0005540827.novalocal sudo[16395]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:03 np0005540827.novalocal sshd-session[8101]: Connection closed by 38.102.83.114 port 38064
Dec 01 09:21:03 np0005540827.novalocal sshd-session[8098]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:21:03 np0005540827.novalocal systemd[1]: session-6.scope: Deactivated successfully.
Dec 01 09:21:03 np0005540827.novalocal systemd[1]: session-6.scope: Consumed 58.245s CPU time.
Dec 01 09:21:03 np0005540827.novalocal systemd-logind[795]: Session 6 logged out. Waiting for processes to exit.
Dec 01 09:21:03 np0005540827.novalocal systemd-logind[795]: Removed session 6.
Dec 01 09:21:22 np0005540827.novalocal sshd-session[24181]: Connection closed by 38.102.83.143 port 49944 [preauth]
Dec 01 09:21:22 np0005540827.novalocal sshd-session[24186]: Connection closed by 38.102.83.143 port 49958 [preauth]
Dec 01 09:21:22 np0005540827.novalocal sshd-session[24176]: Unable to negotiate with 38.102.83.143 port 49966: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Dec 01 09:21:22 np0005540827.novalocal sshd-session[24180]: Unable to negotiate with 38.102.83.143 port 49974: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Dec 01 09:21:22 np0005540827.novalocal sshd-session[24183]: Unable to negotiate with 38.102.83.143 port 49984: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Dec 01 09:21:24 np0005540827.novalocal sshd[1008]: Timeout before authentication for connection from 14.22.89.30 to 38.102.83.236, pid = 7504
Dec 01 09:21:27 np0005540827.novalocal sshd-session[26162]: Accepted publickey for zuul from 38.102.83.114 port 37138 ssh2: RSA SHA256:A8KzWK46IZ9u9VeBeLMGXVv9yesAJ5sUIau6zdZZ9P8
Dec 01 09:21:27 np0005540827.novalocal systemd-logind[795]: New session 7 of user zuul.
Dec 01 09:21:27 np0005540827.novalocal systemd[1]: Started Session 7 of User zuul.
Dec 01 09:21:27 np0005540827.novalocal sshd-session[26162]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:21:28 np0005540827.novalocal python3[26288]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGtq5pibPyVxGWB2xMqk4uL1zofeXFQ8syXRsXPs/DtqKO/PJ2juhFzgoD/wjEUo54K4dvZgfGufGjQyIWW2pRg= zuul@np0005540824.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 09:21:28 np0005540827.novalocal sshd-session[26116]: Invalid user socks from 102.213.183.66 port 37028
Dec 01 09:21:28 np0005540827.novalocal sudo[26543]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puafxmmjzlkfgcnjnduxljtdmdvoenyc ; /usr/bin/python3'
Dec 01 09:21:28 np0005540827.novalocal sudo[26543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:28 np0005540827.novalocal sshd-session[26116]: Received disconnect from 102.213.183.66 port 37028:11: Bye Bye [preauth]
Dec 01 09:21:28 np0005540827.novalocal sshd-session[26116]: Disconnected from invalid user socks 102.213.183.66 port 37028 [preauth]
Dec 01 09:21:28 np0005540827.novalocal python3[26553]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGtq5pibPyVxGWB2xMqk4uL1zofeXFQ8syXRsXPs/DtqKO/PJ2juhFzgoD/wjEUo54K4dvZgfGufGjQyIWW2pRg= zuul@np0005540824.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 09:21:28 np0005540827.novalocal sudo[26543]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:29 np0005540827.novalocal sudo[27072]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxryvprotinnvnplybyomeokbmitxitn ; /usr/bin/python3'
Dec 01 09:21:29 np0005540827.novalocal sudo[27072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:29 np0005540827.novalocal python3[27084]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005540827.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec 01 09:21:29 np0005540827.novalocal useradd[27183]: new group: name=cloud-admin, GID=1002
Dec 01 09:21:29 np0005540827.novalocal useradd[27183]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Dec 01 09:21:30 np0005540827.novalocal sudo[27072]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:30 np0005540827.novalocal sudo[27404]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpcctsbjcxhlwudobwjexqolajfmzplj ; /usr/bin/python3'
Dec 01 09:21:30 np0005540827.novalocal sudo[27404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:30 np0005540827.novalocal python3[27413]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGtq5pibPyVxGWB2xMqk4uL1zofeXFQ8syXRsXPs/DtqKO/PJ2juhFzgoD/wjEUo54K4dvZgfGufGjQyIWW2pRg= zuul@np0005540824.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 01 09:21:30 np0005540827.novalocal sudo[27404]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:30 np0005540827.novalocal sudo[27708]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikcdnbfzhazlblcuhnrmuedoaetmoqfk ; /usr/bin/python3'
Dec 01 09:21:30 np0005540827.novalocal sudo[27708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:32 np0005540827.novalocal python3[27717]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 09:21:32 np0005540827.novalocal sudo[27708]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:32 np0005540827.novalocal sudo[27916]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbdaxxxpazjafekohuyxnjkwtmreogwo ; /usr/bin/python3'
Dec 01 09:21:32 np0005540827.novalocal sudo[27916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:32 np0005540827.novalocal python3[27926]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580890.5775568-170-74323184710794/source _original_basename=tmpvgpcza6z follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:21:32 np0005540827.novalocal sudo[27916]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:33 np0005540827.novalocal sudo[28026]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbydqvqxkjijlfbequekvzoxwtvdtvqh ; /usr/bin/python3'
Dec 01 09:21:33 np0005540827.novalocal sudo[28026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:21:33 np0005540827.novalocal python3[28028]: ansible-ansible.builtin.hostname Invoked with name=compute-2 use=systemd
Dec 01 09:21:33 np0005540827.novalocal systemd[1]: Starting Hostname Service...
Dec 01 09:21:33 np0005540827.novalocal systemd[1]: Started Hostname Service.
Dec 01 09:21:33 np0005540827.novalocal systemd-hostnamed[28036]: Changed pretty hostname to 'compute-2'
Dec 01 09:21:33 compute-2 systemd-hostnamed[28036]: Hostname set to <compute-2> (static)
Dec 01 09:21:33 compute-2 NetworkManager[7192]: <info>  [1764580893.8558] hostname: static hostname changed from "np0005540827.novalocal" to "compute-2"
Dec 01 09:21:33 compute-2 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 01 09:21:33 compute-2 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 01 09:21:33 compute-2 sudo[28026]: pam_unix(sudo:session): session closed for user root
Dec 01 09:21:34 compute-2 sshd-session[26225]: Connection closed by 38.102.83.114 port 37138
Dec 01 09:21:34 compute-2 sshd-session[26162]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:21:34 compute-2 systemd[1]: session-7.scope: Deactivated successfully.
Dec 01 09:21:34 compute-2 systemd[1]: session-7.scope: Consumed 2.180s CPU time.
Dec 01 09:21:34 compute-2 systemd-logind[795]: Session 7 logged out. Waiting for processes to exit.
Dec 01 09:21:34 compute-2 systemd-logind[795]: Removed session 7.
Dec 01 09:21:42 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 01 09:21:42 compute-2 systemd[1]: Finished man-db-cache-update.service.
Dec 01 09:21:42 compute-2 systemd[1]: man-db-cache-update.service: Consumed 54.379s CPU time.
Dec 01 09:21:42 compute-2 systemd[1]: run-re1eb3b87938d498689963ac6d7d2d145.service: Deactivated successfully.
Dec 01 09:21:43 compute-2 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 01 09:21:57 compute-2 sshd[1008]: Timeout before authentication for connection from 222.73.135.87 to 38.102.83.236, pid = 8172
Dec 01 09:22:03 compute-2 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 01 09:22:23 compute-2 sshd[1008]: drop connection #2 from [14.22.89.30]:55338 on [38.102.83.236]:22 penalty: exceeded LoginGraceTime
Dec 01 09:22:37 compute-2 sshd-session[30016]: Received disconnect from 102.213.183.66 port 54536:11: Bye Bye [preauth]
Dec 01 09:22:37 compute-2 sshd-session[30016]: Disconnected from authenticating user root 102.213.183.66 port 54536 [preauth]
Dec 01 09:22:50 compute-2 sshd[1008]: Timeout before authentication for connection from 222.73.135.87 to 38.102.83.236, pid = 9178
Dec 01 09:22:52 compute-2 sshd[1008]: Timeout before authentication for connection from 14.22.89.30 to 38.102.83.236, pid = 11093
Dec 01 09:22:54 compute-2 sshd-session[30018]: Received disconnect from 45.78.219.119 port 49816:11: Bye Bye [preauth]
Dec 01 09:22:54 compute-2 sshd-session[30018]: Disconnected from authenticating user root 45.78.219.119 port 49816 [preauth]
Dec 01 09:23:47 compute-2 sshd-session[30022]: Invalid user ts1 from 102.213.183.66 port 37180
Dec 01 09:23:47 compute-2 sshd-session[30022]: Received disconnect from 102.213.183.66 port 37180:11: Bye Bye [preauth]
Dec 01 09:23:47 compute-2 sshd-session[30022]: Disconnected from invalid user ts1 102.213.183.66 port 37180 [preauth]
Dec 01 09:24:12 compute-2 systemd[1]: Starting Cleanup of Temporary Directories...
Dec 01 09:24:12 compute-2 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Dec 01 09:24:12 compute-2 systemd[1]: Finished Cleanup of Temporary Directories.
Dec 01 09:24:12 compute-2 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Dec 01 09:24:51 compute-2 sshd-session[30028]: Connection closed by 106.75.87.38 port 43042
Dec 01 09:24:57 compute-2 sshd-session[30031]: Received disconnect from 102.213.183.66 port 50382:11: Bye Bye [preauth]
Dec 01 09:24:57 compute-2 sshd-session[30031]: Disconnected from authenticating user root 102.213.183.66 port 50382 [preauth]
Dec 01 09:25:05 compute-2 sshd-session[30029]: Connection closed by 106.75.87.38 port 43570 [preauth]
Dec 01 09:25:51 compute-2 sshd-session[30036]: Accepted publickey for zuul from 38.102.83.143 port 48040 ssh2: RSA SHA256:A8KzWK46IZ9u9VeBeLMGXVv9yesAJ5sUIau6zdZZ9P8
Dec 01 09:25:51 compute-2 systemd-logind[795]: New session 8 of user zuul.
Dec 01 09:25:51 compute-2 systemd[1]: Started Session 8 of User zuul.
Dec 01 09:25:51 compute-2 sshd-session[30036]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:25:51 compute-2 python3[30112]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:25:53 compute-2 sudo[30226]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqbnhprfgrkhcvwjketepymsrlvempid ; /usr/bin/python3'
Dec 01 09:25:53 compute-2 sudo[30226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:25:53 compute-2 python3[30228]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 09:25:53 compute-2 sudo[30226]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:54 compute-2 sudo[30299]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mazzfhiicqttnspbztdugomiudzgpnuc ; /usr/bin/python3'
Dec 01 09:25:54 compute-2 sudo[30299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:25:54 compute-2 python3[30301]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764581153.6128983-34003-61755973698080/source mode=0755 _original_basename=delorean.repo follow=False checksum=39c885eb875fd03e010d1b0454241c26b121dfb2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:25:54 compute-2 sudo[30299]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:54 compute-2 sudo[30325]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcypwhwthipkpklfgjwknfobxmloiory ; /usr/bin/python3'
Dec 01 09:25:54 compute-2 sudo[30325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:25:54 compute-2 python3[30327]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 09:25:54 compute-2 sudo[30325]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:54 compute-2 sudo[30398]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrrxtbxhrjplzwcclatjckhpxoqdxabz ; /usr/bin/python3'
Dec 01 09:25:54 compute-2 sudo[30398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:25:55 compute-2 python3[30400]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764581153.6128983-34003-61755973698080/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:25:55 compute-2 sudo[30398]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:55 compute-2 sudo[30424]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-misacwqzakvuecswwiccwracrzalqbcq ; /usr/bin/python3'
Dec 01 09:25:55 compute-2 sudo[30424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:25:55 compute-2 python3[30426]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 09:25:55 compute-2 sudo[30424]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:55 compute-2 sudo[30497]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnearjqqnuglhzzastynvhhnpauqkwrx ; /usr/bin/python3'
Dec 01 09:25:55 compute-2 sudo[30497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:25:55 compute-2 python3[30499]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764581153.6128983-34003-61755973698080/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:25:55 compute-2 sudo[30497]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:55 compute-2 sudo[30523]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecysiwjecrvsmipiojspxjbnhsefjgcq ; /usr/bin/python3'
Dec 01 09:25:55 compute-2 sudo[30523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:25:55 compute-2 python3[30525]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 09:25:55 compute-2 sudo[30523]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:56 compute-2 sudo[30596]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvvkmrxuxqfsedtphptlkjupzfnmmjdq ; /usr/bin/python3'
Dec 01 09:25:56 compute-2 sudo[30596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:25:56 compute-2 python3[30598]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764581153.6128983-34003-61755973698080/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:25:56 compute-2 sudo[30596]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:56 compute-2 sudo[30622]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-debyjcgoftggiimvoxcvmksmodcinfpr ; /usr/bin/python3'
Dec 01 09:25:56 compute-2 sudo[30622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:25:56 compute-2 python3[30624]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 09:25:56 compute-2 sudo[30622]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:56 compute-2 sudo[30695]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcdswqspdueerzdbsaivhsbqxyqpkdpo ; /usr/bin/python3'
Dec 01 09:25:56 compute-2 sudo[30695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:25:57 compute-2 python3[30697]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764581153.6128983-34003-61755973698080/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:25:57 compute-2 sudo[30695]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:57 compute-2 sudo[30721]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hglsildqzlvuyorejgjcusmvelcrjrwp ; /usr/bin/python3'
Dec 01 09:25:57 compute-2 sudo[30721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:25:57 compute-2 python3[30723]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 09:25:57 compute-2 sudo[30721]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:57 compute-2 sudo[30794]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drqhfkrpjftgpituspukqxhfivoslyoh ; /usr/bin/python3'
Dec 01 09:25:57 compute-2 sudo[30794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:25:57 compute-2 python3[30796]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764581153.6128983-34003-61755973698080/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:25:57 compute-2 sudo[30794]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:57 compute-2 sudo[30820]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxaldoushnpejncqealkaqysdacgivhh ; /usr/bin/python3'
Dec 01 09:25:57 compute-2 sudo[30820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:25:58 compute-2 python3[30822]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 09:25:58 compute-2 sudo[30820]: pam_unix(sudo:session): session closed for user root
Dec 01 09:25:58 compute-2 sudo[30893]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jappmeiljpttixsvmlhpxmcsmyzwwvto ; /usr/bin/python3'
Dec 01 09:25:58 compute-2 sudo[30893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:25:58 compute-2 python3[30895]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764581153.6128983-34003-61755973698080/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=6e18e2038d54303b4926db53c0b6cced515a9151 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:25:58 compute-2 sudo[30893]: pam_unix(sudo:session): session closed for user root
Dec 01 09:26:11 compute-2 python3[30943]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:26:12 compute-2 sshd-session[30945]: Invalid user valheim from 102.213.183.66 port 52970
Dec 01 09:26:12 compute-2 sshd-session[30945]: Received disconnect from 102.213.183.66 port 52970:11: Bye Bye [preauth]
Dec 01 09:26:12 compute-2 sshd-session[30945]: Disconnected from invalid user valheim 102.213.183.66 port 52970 [preauth]
Dec 01 09:27:21 compute-2 systemd[1]: Starting dnf makecache...
Dec 01 09:27:21 compute-2 sshd-session[30948]: Invalid user jrodrig from 102.213.183.66 port 46514
Dec 01 09:27:21 compute-2 sshd-session[30948]: Received disconnect from 102.213.183.66 port 46514:11: Bye Bye [preauth]
Dec 01 09:27:21 compute-2 sshd-session[30948]: Disconnected from invalid user jrodrig 102.213.183.66 port 46514 [preauth]
Dec 01 09:27:21 compute-2 dnf[30950]: Failed determining last makecache time.
Dec 01 09:27:21 compute-2 dnf[30950]: delorean-openstack-barbican-42b4c41831408a8e323 403 kB/s |  13 kB     00:00
Dec 01 09:27:21 compute-2 dnf[30950]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 2.7 MB/s |  65 kB     00:00
Dec 01 09:27:21 compute-2 dnf[30950]: delorean-openstack-cinder-1c00d6490d88e436f26ef 1.4 MB/s |  32 kB     00:00
Dec 01 09:27:21 compute-2 dnf[30950]: delorean-python-stevedore-c4acc5639fd2329372142 4.6 MB/s | 131 kB     00:00
Dec 01 09:27:21 compute-2 dnf[30950]: delorean-python-cloudkitty-tests-tempest-2c80f8 1.2 MB/s |  32 kB     00:00
Dec 01 09:27:21 compute-2 dnf[30950]: delorean-os-net-config-d0cedbdb788d43e5c7551df5  13 MB/s | 349 kB     00:00
Dec 01 09:27:21 compute-2 dnf[30950]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 1.9 MB/s |  42 kB     00:00
Dec 01 09:27:21 compute-2 dnf[30950]: delorean-python-designate-tests-tempest-347fdbc 718 kB/s |  18 kB     00:00
Dec 01 09:27:21 compute-2 dnf[30950]: delorean-openstack-glance-1fd12c29b339f30fe823e 688 kB/s |  18 kB     00:00
Dec 01 09:27:21 compute-2 dnf[30950]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 1.1 MB/s |  29 kB     00:00
Dec 01 09:27:21 compute-2 dnf[30950]: delorean-openstack-manila-3c01b7181572c95dac462 1.1 MB/s |  25 kB     00:00
Dec 01 09:27:21 compute-2 dnf[30950]: delorean-python-whitebox-neutron-tests-tempest- 5.5 MB/s | 154 kB     00:00
Dec 01 09:27:21 compute-2 dnf[30950]: delorean-openstack-octavia-ba397f07a7331190208c 1.3 MB/s |  26 kB     00:00
Dec 01 09:27:21 compute-2 dnf[30950]: delorean-openstack-watcher-c014f81a8647287f6dcc 839 kB/s |  16 kB     00:00
Dec 01 09:27:22 compute-2 dnf[30950]: delorean-ansible-config_template-5ccaa22121a7ff 392 kB/s | 7.4 kB     00:00
Dec 01 09:27:22 compute-2 dnf[30950]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 5.5 MB/s | 144 kB     00:00
Dec 01 09:27:22 compute-2 dnf[30950]: delorean-openstack-swift-dc98a8463506ac520c469a 579 kB/s |  14 kB     00:00
Dec 01 09:27:22 compute-2 dnf[30950]: delorean-python-tempestconf-8515371b7cceebd4282 1.9 MB/s |  53 kB     00:00
Dec 01 09:27:22 compute-2 dnf[30950]: delorean-openstack-heat-ui-013accbfd179753bc3f0 3.7 MB/s |  96 kB     00:00
Dec 01 09:27:22 compute-2 dnf[30950]: CentOS Stream 9 - BaseOS                         29 kB/s | 7.3 kB     00:00
Dec 01 09:27:22 compute-2 dnf[30950]: CentOS Stream 9 - AppStream                      71 kB/s | 7.4 kB     00:00
Dec 01 09:27:22 compute-2 dnf[30950]: CentOS Stream 9 - CRB                            31 kB/s | 7.2 kB     00:00
Dec 01 09:27:23 compute-2 dnf[30950]: CentOS Stream 9 - Extras packages                72 kB/s | 8.3 kB     00:00
Dec 01 09:27:23 compute-2 dnf[30950]: dlrn-antelope-testing                            29 MB/s | 1.1 MB     00:00
Dec 01 09:27:23 compute-2 dnf[30950]: dlrn-antelope-build-deps                         18 MB/s | 461 kB     00:00
Dec 01 09:27:23 compute-2 dnf[30950]: centos9-rabbitmq                                8.2 MB/s | 123 kB     00:00
Dec 01 09:27:23 compute-2 dnf[30950]: centos9-storage                                  26 MB/s | 415 kB     00:00
Dec 01 09:27:23 compute-2 dnf[30950]: centos9-opstools                                4.5 MB/s |  51 kB     00:00
Dec 01 09:27:24 compute-2 dnf[30950]: NFV SIG OpenvSwitch                              21 MB/s | 456 kB     00:00
Dec 01 09:27:24 compute-2 dnf[30950]: repo-setup-centos-appstream                      91 MB/s |  25 MB     00:00
Dec 01 09:27:25 compute-2 sshd[1008]: Timeout before authentication for connection from 14.22.89.30 to 38.102.83.236, pid = 30033
Dec 01 09:27:30 compute-2 dnf[30950]: repo-setup-centos-baseos                         80 MB/s | 8.8 MB     00:00
Dec 01 09:27:31 compute-2 dnf[30950]: repo-setup-centos-highavailability               28 MB/s | 744 kB     00:00
Dec 01 09:27:31 compute-2 dnf[30950]: repo-setup-centos-powertools                     72 MB/s | 7.3 MB     00:00
Dec 01 09:27:35 compute-2 dnf[30950]: Extra Packages for Enterprise Linux 9 - x86_64   13 MB/s |  20 MB     00:01
Dec 01 09:27:47 compute-2 dnf[30950]: Metadata cache created.
Dec 01 09:27:47 compute-2 systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec 01 09:27:47 compute-2 systemd[1]: Finished dnf makecache.
Dec 01 09:27:47 compute-2 systemd[1]: dnf-makecache.service: Consumed 23.186s CPU time.
Dec 01 09:28:16 compute-2 sshd-session[31052]: Invalid user rahul from 45.78.219.119 port 35982
Dec 01 09:28:16 compute-2 sshd-session[31052]: Received disconnect from 45.78.219.119 port 35982:11: Bye Bye [preauth]
Dec 01 09:28:16 compute-2 sshd-session[31052]: Disconnected from invalid user rahul 45.78.219.119 port 35982 [preauth]
Dec 01 09:28:32 compute-2 sshd-session[31054]: Invalid user kapsch from 102.213.183.66 port 36274
Dec 01 09:28:32 compute-2 sshd-session[31054]: Received disconnect from 102.213.183.66 port 36274:11: Bye Bye [preauth]
Dec 01 09:28:32 compute-2 sshd-session[31054]: Disconnected from invalid user kapsch 102.213.183.66 port 36274 [preauth]
Dec 01 09:29:46 compute-2 sshd-session[31056]: Invalid user foundry from 102.213.183.66 port 52988
Dec 01 09:29:46 compute-2 sshd-session[31056]: Received disconnect from 102.213.183.66 port 52988:11: Bye Bye [preauth]
Dec 01 09:29:46 compute-2 sshd-session[31056]: Disconnected from invalid user foundry 102.213.183.66 port 52988 [preauth]
Dec 01 09:29:59 compute-2 sshd[1008]: Timeout before authentication for connection from 222.73.135.87 to 38.102.83.236, pid = 31051
Dec 01 09:30:55 compute-2 sshd-session[31062]: Invalid user syncuser from 102.213.183.66 port 42164
Dec 01 09:30:56 compute-2 sshd-session[31062]: Received disconnect from 102.213.183.66 port 42164:11: Bye Bye [preauth]
Dec 01 09:30:56 compute-2 sshd-session[31062]: Disconnected from invalid user syncuser 102.213.183.66 port 42164 [preauth]
Dec 01 09:31:00 compute-2 sshd-session[31060]: Received disconnect from 45.78.219.119 port 41930:11: Bye Bye [preauth]
Dec 01 09:31:00 compute-2 sshd-session[31060]: Disconnected from 45.78.219.119 port 41930 [preauth]
Dec 01 09:31:10 compute-2 sshd-session[30039]: Received disconnect from 38.102.83.143 port 48040:11: disconnected by user
Dec 01 09:31:10 compute-2 sshd-session[30039]: Disconnected from user zuul 38.102.83.143 port 48040
Dec 01 09:31:10 compute-2 sshd-session[30036]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:31:10 compute-2 systemd[1]: session-8.scope: Deactivated successfully.
Dec 01 09:31:10 compute-2 systemd[1]: session-8.scope: Consumed 5.566s CPU time.
Dec 01 09:31:10 compute-2 systemd-logind[795]: Session 8 logged out. Waiting for processes to exit.
Dec 01 09:31:10 compute-2 systemd-logind[795]: Removed session 8.
Dec 01 09:31:51 compute-2 sshd-session[31067]: Connection closed by 172.104.11.51 port 48880 [preauth]
Dec 01 09:31:52 compute-2 sshd-session[31069]: Connection closed by 172.104.11.51 port 48892 [preauth]
Dec 01 09:31:52 compute-2 sshd-session[31071]: Unable to negotiate with 172.104.11.51 port 48896: no matching host key type found. Their offer: ssh-ed25519-cert-v01@openssh.com,ssh-ed25519 [preauth]
Dec 01 09:32:01 compute-2 sshd[1008]: Timeout before authentication for connection from 14.22.89.30 to 38.102.83.236, pid = 31059
Dec 01 09:32:09 compute-2 sshd-session[31073]: Received disconnect from 102.213.183.66 port 36976:11: Bye Bye [preauth]
Dec 01 09:32:09 compute-2 sshd-session[31073]: Disconnected from authenticating user root 102.213.183.66 port 36976 [preauth]
Dec 01 09:33:00 compute-2 sshd[1008]: drop connection #2 from [14.22.89.30]:42494 on [38.102.83.236]:22 penalty: exceeded LoginGraceTime
Dec 01 09:33:15 compute-2 sshd-session[31075]: Received disconnect from 102.213.183.66 port 48412:11: Bye Bye [preauth]
Dec 01 09:33:15 compute-2 sshd-session[31075]: Disconnected from authenticating user root 102.213.183.66 port 48412 [preauth]
Dec 01 09:33:30 compute-2 sshd[1008]: Timeout before authentication for connection from 14.22.89.30 to 38.102.83.236, pid = 31065
Dec 01 09:33:32 compute-2 sshd-session[31078]: Received disconnect from 45.78.219.119 port 41394:11: Bye Bye [preauth]
Dec 01 09:33:32 compute-2 sshd-session[31078]: Disconnected from authenticating user root 45.78.219.119 port 41394 [preauth]
Dec 01 09:33:38 compute-2 sshd[1008]: Timeout before authentication for connection from 222.73.135.87 to 38.102.83.236, pid = 31066
Dec 01 09:34:23 compute-2 sshd-session[31081]: Invalid user hu from 102.213.183.66 port 47836
Dec 01 09:34:23 compute-2 sshd-session[31081]: Received disconnect from 102.213.183.66 port 47836:11: Bye Bye [preauth]
Dec 01 09:34:23 compute-2 sshd-session[31081]: Disconnected from invalid user hu 102.213.183.66 port 47836 [preauth]
Dec 01 09:35:29 compute-2 sshd[1008]: Timeout before authentication for connection from 222.73.135.87 to 38.102.83.236, pid = 31080
Dec 01 09:35:33 compute-2 sshd-session[31084]: Received disconnect from 102.213.183.66 port 48310:11: Bye Bye [preauth]
Dec 01 09:35:33 compute-2 sshd-session[31084]: Disconnected from authenticating user root 102.213.183.66 port 48310 [preauth]
Dec 01 09:36:37 compute-2 sshd[1008]: Timeout before authentication for connection from 114.80.32.225 to 38.102.83.236, pid = 31083
Dec 01 09:36:47 compute-2 sshd-session[31089]: Invalid user user2 from 102.213.183.66 port 53350
Dec 01 09:36:47 compute-2 sshd-session[31089]: Received disconnect from 102.213.183.66 port 53350:11: Bye Bye [preauth]
Dec 01 09:36:47 compute-2 sshd-session[31089]: Disconnected from invalid user user2 102.213.183.66 port 53350 [preauth]
Dec 01 09:37:44 compute-2 sshd-session[31092]: Invalid user bitrix from 14.22.89.30 port 33280
Dec 01 09:37:44 compute-2 sshd-session[31092]: Received disconnect from 14.22.89.30 port 33280:11: Bye Bye [preauth]
Dec 01 09:37:44 compute-2 sshd-session[31092]: Disconnected from invalid user bitrix 14.22.89.30 port 33280 [preauth]
Dec 01 09:37:56 compute-2 sshd-session[31095]: Invalid user tecnopos from 102.213.183.66 port 38792
Dec 01 09:37:56 compute-2 sshd-session[31095]: Received disconnect from 102.213.183.66 port 38792:11: Bye Bye [preauth]
Dec 01 09:37:56 compute-2 sshd-session[31095]: Disconnected from invalid user tecnopos 102.213.183.66 port 38792 [preauth]
Dec 01 09:38:03 compute-2 sshd-session[31097]: Accepted publickey for zuul from 192.168.122.30 port 36842 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 09:38:03 compute-2 systemd-logind[795]: New session 9 of user zuul.
Dec 01 09:38:03 compute-2 systemd[1]: Started Session 9 of User zuul.
Dec 01 09:38:03 compute-2 sshd-session[31097]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:38:04 compute-2 python3.9[31250]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:38:06 compute-2 sudo[31429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmctdymdpjiipnwfqknjdaaybvywgrjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581885.7763996-59-269272009781018/AnsiballZ_command.py'
Dec 01 09:38:06 compute-2 sudo[31429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:38:06 compute-2 python3.9[31431]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:38:13 compute-2 sshd[1008]: Timeout before authentication for connection from 14.22.89.30 to 38.102.83.236, pid = 31086
Dec 01 09:38:13 compute-2 sudo[31429]: pam_unix(sudo:session): session closed for user root
Dec 01 09:38:14 compute-2 sshd-session[31100]: Connection closed by 192.168.122.30 port 36842
Dec 01 09:38:14 compute-2 sshd-session[31097]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:38:14 compute-2 systemd[1]: session-9.scope: Deactivated successfully.
Dec 01 09:38:14 compute-2 systemd[1]: session-9.scope: Consumed 7.677s CPU time.
Dec 01 09:38:14 compute-2 systemd-logind[795]: Session 9 logged out. Waiting for processes to exit.
Dec 01 09:38:14 compute-2 systemd-logind[795]: Removed session 9.
Dec 01 09:38:29 compute-2 sshd-session[31488]: Accepted publickey for zuul from 192.168.122.30 port 53678 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 09:38:29 compute-2 systemd-logind[795]: New session 10 of user zuul.
Dec 01 09:38:29 compute-2 systemd[1]: Started Session 10 of User zuul.
Dec 01 09:38:29 compute-2 sshd-session[31488]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:38:30 compute-2 python3.9[31641]: ansible-ansible.legacy.ping Invoked with data=pong
Dec 01 09:38:31 compute-2 python3.9[31815]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:38:32 compute-2 sudo[31965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxotoepqdruuzbatufrmpywjxhmqjojy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581912.0659833-96-21257906950/AnsiballZ_command.py'
Dec 01 09:38:32 compute-2 sudo[31965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:38:32 compute-2 python3.9[31967]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:38:32 compute-2 sudo[31965]: pam_unix(sudo:session): session closed for user root
Dec 01 09:38:33 compute-2 sudo[32120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hccqgepaimxsfndbyzcigufometcrakn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581913.1885643-132-25204353719266/AnsiballZ_stat.py'
Dec 01 09:38:33 compute-2 sudo[32120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:38:33 compute-2 python3.9[32122]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:38:33 compute-2 sudo[32120]: pam_unix(sudo:session): session closed for user root
Dec 01 09:38:34 compute-2 sudo[32272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbsblsctnbkvbsodmqobifzkctannfvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581914.0901382-156-182989947292821/AnsiballZ_file.py'
Dec 01 09:38:34 compute-2 sudo[32272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:38:34 compute-2 python3.9[32274]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:38:34 compute-2 sudo[32272]: pam_unix(sudo:session): session closed for user root
Dec 01 09:38:35 compute-2 sudo[32424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvjwlvsckrhhsgwzfsxjbzlcexpxgxub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581915.0103467-179-21745468106705/AnsiballZ_stat.py'
Dec 01 09:38:35 compute-2 sudo[32424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:38:35 compute-2 python3.9[32426]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:38:35 compute-2 sudo[32424]: pam_unix(sudo:session): session closed for user root
Dec 01 09:38:35 compute-2 sudo[32547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyfoebxzwpgxytqvzfmrqoffbztbsqpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581915.0103467-179-21745468106705/AnsiballZ_copy.py'
Dec 01 09:38:35 compute-2 sudo[32547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:38:36 compute-2 python3.9[32549]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764581915.0103467-179-21745468106705/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:38:36 compute-2 sudo[32547]: pam_unix(sudo:session): session closed for user root
Dec 01 09:38:36 compute-2 sudo[32699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbgajnrkzazjjsjjprqllmukmimotkha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581916.4740415-224-264329806729825/AnsiballZ_setup.py'
Dec 01 09:38:36 compute-2 sudo[32699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:38:37 compute-2 python3.9[32701]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:38:37 compute-2 sudo[32699]: pam_unix(sudo:session): session closed for user root
Dec 01 09:38:37 compute-2 sudo[32855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwkfkgdepimhrsvxnqsvytrsczhrktcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581917.5870912-249-116302685936018/AnsiballZ_file.py'
Dec 01 09:38:37 compute-2 sudo[32855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:38:38 compute-2 python3.9[32857]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:38:38 compute-2 sudo[32855]: pam_unix(sudo:session): session closed for user root
Dec 01 09:38:38 compute-2 sudo[33007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zugabgzsgxpofkscpuexvsqiwnnyulve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581918.4360824-275-262706027511634/AnsiballZ_file.py'
Dec 01 09:38:38 compute-2 sudo[33007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:38:38 compute-2 python3.9[33009]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:38:38 compute-2 sudo[33007]: pam_unix(sudo:session): session closed for user root
Dec 01 09:38:39 compute-2 sshd-session[31993]: Received disconnect from 45.78.219.119 port 43716:11: Bye Bye [preauth]
Dec 01 09:38:39 compute-2 sshd-session[31993]: Disconnected from authenticating user root 45.78.219.119 port 43716 [preauth]
Dec 01 09:38:39 compute-2 python3.9[33159]: ansible-ansible.builtin.service_facts Invoked
Dec 01 09:38:45 compute-2 python3.9[33412]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:38:45 compute-2 python3.9[33562]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:38:47 compute-2 python3.9[33716]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:38:48 compute-2 sudo[33872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juzcplohyfhxkrgxgwvvcoruxbvvvbcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581927.8853097-419-59553024934786/AnsiballZ_setup.py'
Dec 01 09:38:48 compute-2 sudo[33872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:38:48 compute-2 python3.9[33874]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:38:48 compute-2 sudo[33872]: pam_unix(sudo:session): session closed for user root
Dec 01 09:38:49 compute-2 sudo[33958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nergvnwbndghfttxtcgdtzjcrwrqinlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764581927.8853097-419-59553024934786/AnsiballZ_dnf.py'
Dec 01 09:38:49 compute-2 sudo[33958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:38:49 compute-2 python3.9[33960]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:38:50 compute-2 sshd-session[33906]: Invalid user grid from 222.73.135.87 port 33916
Dec 01 09:38:50 compute-2 sshd-session[33906]: Received disconnect from 222.73.135.87 port 33916:11: Bye Bye [preauth]
Dec 01 09:38:50 compute-2 sshd-session[33906]: Disconnected from invalid user grid 222.73.135.87 port 33916 [preauth]
Dec 01 09:39:03 compute-2 sshd[1008]: Timeout before authentication for connection from 222.73.135.87 to 38.102.83.236, pid = 31091
Dec 01 09:39:04 compute-2 sshd-session[34050]: Invalid user dixi from 102.213.183.66 port 37232
Dec 01 09:39:04 compute-2 sshd-session[34050]: Received disconnect from 102.213.183.66 port 37232:11: Bye Bye [preauth]
Dec 01 09:39:04 compute-2 sshd-session[34050]: Disconnected from invalid user dixi 102.213.183.66 port 37232 [preauth]
Dec 01 09:39:19 compute-2 sshd[1008]: drop connection #1 from [14.22.89.30]:45438 on [38.102.83.236]:22 penalty: exceeded LoginGraceTime
Dec 01 09:39:30 compute-2 systemd[1]: Reloading.
Dec 01 09:39:30 compute-2 systemd-rc-local-generator[34161]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:39:30 compute-2 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Dec 01 09:39:31 compute-2 systemd[1]: Reloading.
Dec 01 09:39:31 compute-2 systemd-rc-local-generator[34202]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:39:31 compute-2 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Dec 01 09:39:31 compute-2 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Dec 01 09:39:31 compute-2 systemd[1]: Reloading.
Dec 01 09:39:31 compute-2 systemd-rc-local-generator[34241]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:39:31 compute-2 systemd[1]: Listening on LVM2 poll daemon socket.
Dec 01 09:39:31 compute-2 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Dec 01 09:39:31 compute-2 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Dec 01 09:39:31 compute-2 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Dec 01 09:39:56 compute-2 sshd[1008]: Timeout before authentication for connection from 222.73.135.87 to 38.102.83.236, pid = 31094
Dec 01 09:40:13 compute-2 sshd-session[34445]: Invalid user tony from 102.213.183.66 port 38074
Dec 01 09:40:13 compute-2 sshd-session[34445]: Received disconnect from 102.213.183.66 port 38074:11: Bye Bye [preauth]
Dec 01 09:40:13 compute-2 sshd-session[34445]: Disconnected from invalid user tony 102.213.183.66 port 38074 [preauth]
Dec 01 09:40:36 compute-2 kernel: SELinux:  Converting 2719 SID table entries...
Dec 01 09:40:36 compute-2 kernel: SELinux:  policy capability network_peer_controls=1
Dec 01 09:40:36 compute-2 kernel: SELinux:  policy capability open_perms=1
Dec 01 09:40:36 compute-2 kernel: SELinux:  policy capability extended_socket_class=1
Dec 01 09:40:36 compute-2 kernel: SELinux:  policy capability always_check_network=0
Dec 01 09:40:36 compute-2 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 01 09:40:36 compute-2 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 01 09:40:36 compute-2 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 01 09:40:36 compute-2 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Dec 01 09:40:36 compute-2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 01 09:40:36 compute-2 systemd[1]: Starting man-db-cache-update.service...
Dec 01 09:40:36 compute-2 systemd[1]: Reloading.
Dec 01 09:40:36 compute-2 systemd-rc-local-generator[34596]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:40:36 compute-2 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 01 09:40:37 compute-2 sudo[33958]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:37 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 01 09:40:37 compute-2 systemd[1]: Finished man-db-cache-update.service.
Dec 01 09:40:37 compute-2 systemd[1]: man-db-cache-update.service: Consumed 1.031s CPU time.
Dec 01 09:40:37 compute-2 systemd[1]: run-r56655424d6864b909ed91ade5e5c84da.service: Deactivated successfully.
Dec 01 09:40:38 compute-2 sudo[35513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blfffovyarywonzmemzzjjponqdeulqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582038.468721-456-35907796273481/AnsiballZ_command.py'
Dec 01 09:40:38 compute-2 sudo[35513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:40:38 compute-2 python3.9[35515]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:40:39 compute-2 sudo[35513]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:40 compute-2 sudo[35794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyypswzznvarrvfdrxgqntapkcpxtuxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582040.1599393-481-161531873017439/AnsiballZ_selinux.py'
Dec 01 09:40:40 compute-2 sudo[35794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:40:41 compute-2 python3.9[35796]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec 01 09:40:41 compute-2 sudo[35794]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:41 compute-2 sudo[35946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jepihoxohkcnqfqczkdlxfelvqvyynnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582041.6726441-513-139477534527247/AnsiballZ_command.py'
Dec 01 09:40:41 compute-2 sudo[35946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:40:42 compute-2 python3.9[35948]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec 01 09:40:43 compute-2 sudo[35946]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:44 compute-2 sudo[36099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozrhdyiuhndaiwtghmfyvicnetnkkrbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582043.9587533-537-229683660292366/AnsiballZ_file.py'
Dec 01 09:40:44 compute-2 sudo[36099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:40:44 compute-2 python3.9[36101]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:40:45 compute-2 sudo[36099]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:45 compute-2 sudo[36251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnzbmxlbmxgimzjoibqzwolsfqzlqmwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582045.2293603-562-203176529548229/AnsiballZ_mount.py'
Dec 01 09:40:45 compute-2 sudo[36251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:40:45 compute-2 python3.9[36253]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec 01 09:40:45 compute-2 sudo[36251]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:47 compute-2 sudo[36403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsqhpgtqzumojhtcmsfmsnuecajwninj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582046.8120155-645-206486567415908/AnsiballZ_file.py'
Dec 01 09:40:47 compute-2 sudo[36403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:40:47 compute-2 python3.9[36405]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:40:47 compute-2 sudo[36403]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:47 compute-2 sudo[36555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpusspwobhadhqmoyuufzshinnbvzrsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582047.5746458-669-225336485301809/AnsiballZ_stat.py'
Dec 01 09:40:47 compute-2 sudo[36555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:40:48 compute-2 python3.9[36557]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:40:48 compute-2 sudo[36555]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:48 compute-2 sudo[36678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgzefiuryimyvzxbtduwtlxpljmlwlcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582047.5746458-669-225336485301809/AnsiballZ_copy.py'
Dec 01 09:40:48 compute-2 sudo[36678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:40:48 compute-2 python3.9[36680]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582047.5746458-669-225336485301809/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8c8748787c49c5bdccd5df153e138fac81f5459e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:40:48 compute-2 sudo[36678]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:52 compute-2 sudo[36830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxuxwperdkhfvvlogbcntzsvqezkjcpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582052.063148-741-60951261235191/AnsiballZ_stat.py'
Dec 01 09:40:52 compute-2 sudo[36830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:40:52 compute-2 python3.9[36832]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:40:52 compute-2 sudo[36830]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:54 compute-2 sudo[36982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztbcndlczkdnckqppcdvzjpzkdvhdkeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582053.955698-765-64713935788087/AnsiballZ_command.py'
Dec 01 09:40:54 compute-2 sudo[36982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:40:54 compute-2 python3.9[36984]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:40:54 compute-2 sudo[36982]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:56 compute-2 sudo[37135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmirytagczlxcpwnpwniqjfsyexuikll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582055.9365351-789-149531526170599/AnsiballZ_file.py'
Dec 01 09:40:56 compute-2 sudo[37135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:40:56 compute-2 python3.9[37137]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:40:56 compute-2 sudo[37135]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:57 compute-2 sudo[37287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-civxuttkxlclyxlbfoeervjkhdmqalyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582056.9494433-822-83378792948341/AnsiballZ_getent.py'
Dec 01 09:40:57 compute-2 sudo[37287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:40:57 compute-2 python3.9[37289]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec 01 09:40:57 compute-2 sudo[37287]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:57 compute-2 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 01 09:40:58 compute-2 sudo[37441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mglljvfkoxzehfzzvyvnqeqdpcrcrmzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582057.949083-845-102054468709548/AnsiballZ_group.py'
Dec 01 09:40:58 compute-2 sudo[37441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:40:58 compute-2 python3.9[37443]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 01 09:40:58 compute-2 groupadd[37444]: group added to /etc/group: name=qemu, GID=107
Dec 01 09:40:58 compute-2 groupadd[37444]: group added to /etc/gshadow: name=qemu
Dec 01 09:40:58 compute-2 groupadd[37444]: new group: name=qemu, GID=107
Dec 01 09:40:58 compute-2 sudo[37441]: pam_unix(sudo:session): session closed for user root
Dec 01 09:40:59 compute-2 sudo[37599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhnifkrryxsyuccfwqusiblgygfqdqyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582059.0101376-869-167527094041510/AnsiballZ_user.py'
Dec 01 09:40:59 compute-2 sudo[37599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:40:59 compute-2 python3.9[37601]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 01 09:40:59 compute-2 useradd[37603]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Dec 01 09:40:59 compute-2 sudo[37599]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:00 compute-2 sudo[37759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzstxedwiflhmttsybxdrqnpezqztixs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582060.3271093-894-7374393424221/AnsiballZ_getent.py'
Dec 01 09:41:00 compute-2 sudo[37759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:41:00 compute-2 python3.9[37761]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec 01 09:41:00 compute-2 sudo[37759]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:01 compute-2 sudo[37912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avtfcuhxsmbgmeirlwcrletjahakcznp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582061.237932-918-7008363514896/AnsiballZ_group.py'
Dec 01 09:41:01 compute-2 sudo[37912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:41:01 compute-2 python3.9[37914]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 01 09:41:01 compute-2 groupadd[37915]: group added to /etc/group: name=hugetlbfs, GID=42477
Dec 01 09:41:01 compute-2 groupadd[37915]: group added to /etc/gshadow: name=hugetlbfs
Dec 01 09:41:01 compute-2 groupadd[37915]: new group: name=hugetlbfs, GID=42477
Dec 01 09:41:01 compute-2 sudo[37912]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:02 compute-2 sudo[38070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qalrugifiuqrxhrhbffdqyiycbloxjnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582062.1606638-944-45777647022671/AnsiballZ_file.py'
Dec 01 09:41:02 compute-2 sudo[38070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:41:02 compute-2 python3.9[38072]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec 01 09:41:02 compute-2 sudo[38070]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:03 compute-2 sudo[38222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxegonxwrthpfjzhogtxaslitmppwicq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582063.14479-978-430626479953/AnsiballZ_dnf.py'
Dec 01 09:41:03 compute-2 sudo[38222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:41:03 compute-2 python3.9[38224]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:41:05 compute-2 sudo[38222]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:06 compute-2 sudo[38377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzqucyazulsefxakgbulahgowwocwsdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582065.7571774-1002-57559560594849/AnsiballZ_file.py'
Dec 01 09:41:06 compute-2 sudo[38377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:41:06 compute-2 python3.9[38379]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:41:06 compute-2 sudo[38377]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:06 compute-2 sudo[38529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-foynnjkseiwkkgvlimijlssjzxabhzwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582066.5362132-1026-235503535381085/AnsiballZ_stat.py'
Dec 01 09:41:06 compute-2 sudo[38529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:41:07 compute-2 python3.9[38531]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:41:07 compute-2 sudo[38529]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:07 compute-2 sudo[38652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuqhkevcwzydbeewowaogcjddglvbqsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582066.5362132-1026-235503535381085/AnsiballZ_copy.py'
Dec 01 09:41:07 compute-2 sudo[38652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:41:07 compute-2 python3.9[38654]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764582066.5362132-1026-235503535381085/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:41:07 compute-2 sudo[38652]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:08 compute-2 sshd-session[38226]: Invalid user minecraft from 45.78.219.119 port 54474
Dec 01 09:41:08 compute-2 sshd-session[38226]: Received disconnect from 45.78.219.119 port 54474:11: Bye Bye [preauth]
Dec 01 09:41:08 compute-2 sshd-session[38226]: Disconnected from invalid user minecraft 45.78.219.119 port 54474 [preauth]
Dec 01 09:41:08 compute-2 sudo[38804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpzsmimdhmjbmyigvbumsdfryoxqucfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582067.961366-1072-186190623237255/AnsiballZ_systemd.py'
Dec 01 09:41:08 compute-2 sudo[38804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:41:08 compute-2 python3.9[38806]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 09:41:08 compute-2 systemd[1]: Starting Load Kernel Modules...
Dec 01 09:41:08 compute-2 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec 01 09:41:08 compute-2 kernel: Bridge firewalling registered
Dec 01 09:41:08 compute-2 systemd-modules-load[38810]: Inserted module 'br_netfilter'
Dec 01 09:41:08 compute-2 systemd[1]: Finished Load Kernel Modules.
Dec 01 09:41:08 compute-2 sudo[38804]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:09 compute-2 sudo[38963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abjpkbciqmzsfgprmxcmgtfpstopfozp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582069.3216593-1095-101046730846442/AnsiballZ_stat.py'
Dec 01 09:41:09 compute-2 sudo[38963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:41:09 compute-2 python3.9[38965]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:41:09 compute-2 sudo[38963]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:10 compute-2 sudo[39086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aihgclnoivjygdauselzgsfggjpatwns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582069.3216593-1095-101046730846442/AnsiballZ_copy.py'
Dec 01 09:41:10 compute-2 sudo[39086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:41:10 compute-2 python3.9[39088]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764582069.3216593-1095-101046730846442/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:41:10 compute-2 sudo[39086]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:11 compute-2 sudo[39238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnsrguxogixkcbcqmhaalfonkwvzeqoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582071.0401106-1149-14232277901389/AnsiballZ_dnf.py'
Dec 01 09:41:11 compute-2 sudo[39238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:41:11 compute-2 python3.9[39240]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:41:15 compute-2 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Dec 01 09:41:15 compute-2 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Dec 01 09:41:15 compute-2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 01 09:41:15 compute-2 systemd[1]: Starting man-db-cache-update.service...
Dec 01 09:41:15 compute-2 systemd[1]: Reloading.
Dec 01 09:41:16 compute-2 systemd-rc-local-generator[39304]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:41:16 compute-2 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 01 09:41:16 compute-2 sudo[39238]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:17 compute-2 python3.9[40801]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:41:18 compute-2 python3.9[42469]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec 01 09:41:19 compute-2 python3.9[43148]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:41:19 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 01 09:41:19 compute-2 systemd[1]: Finished man-db-cache-update.service.
Dec 01 09:41:19 compute-2 systemd[1]: man-db-cache-update.service: Consumed 4.694s CPU time.
Dec 01 09:41:19 compute-2 systemd[1]: run-rbab968eb945b4ea4a3d08c35e5fa84f7.service: Deactivated successfully.
Dec 01 09:41:20 compute-2 sudo[43398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnyjtinwnrhylytmmutgetlvmzkopjgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582080.0167363-1266-184200852442857/AnsiballZ_command.py'
Dec 01 09:41:20 compute-2 sudo[43398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:41:20 compute-2 python3.9[43400]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:41:20 compute-2 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 01 09:41:20 compute-2 systemd[1]: Starting Authorization Manager...
Dec 01 09:41:20 compute-2 systemd[1]: Started Dynamic System Tuning Daemon.
Dec 01 09:41:21 compute-2 polkitd[43617]: Started polkitd version 0.117
Dec 01 09:41:21 compute-2 polkitd[43617]: Loading rules from directory /etc/polkit-1/rules.d
Dec 01 09:41:21 compute-2 polkitd[43617]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 01 09:41:21 compute-2 polkitd[43617]: Finished loading, compiling and executing 2 rules
Dec 01 09:41:21 compute-2 systemd[1]: Started Authorization Manager.
Dec 01 09:41:21 compute-2 polkitd[43617]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Dec 01 09:41:21 compute-2 sudo[43398]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:21 compute-2 sudo[43785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajbchudqrfvurvhvniwiowxbpzstgnxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582081.4362953-1294-7548404005695/AnsiballZ_systemd.py'
Dec 01 09:41:21 compute-2 sudo[43785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:41:22 compute-2 python3.9[43787]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:41:22 compute-2 systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec 01 09:41:22 compute-2 systemd[1]: tuned.service: Deactivated successfully.
Dec 01 09:41:22 compute-2 systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec 01 09:41:22 compute-2 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 01 09:41:22 compute-2 systemd[1]: Started Dynamic System Tuning Daemon.
Dec 01 09:41:22 compute-2 sudo[43785]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:23 compute-2 python3.9[43948]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec 01 09:41:26 compute-2 sudo[44098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdenqrbgnigxtncgclafuclwdmeewtfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582086.4804094-1464-24379139165581/AnsiballZ_systemd.py'
Dec 01 09:41:26 compute-2 sudo[44098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:41:27 compute-2 python3.9[44100]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:41:27 compute-2 systemd[1]: Reloading.
Dec 01 09:41:27 compute-2 systemd-rc-local-generator[44132]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:41:27 compute-2 sudo[44098]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:27 compute-2 sudo[44287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqawgbriqgwsaztezuuxrcpxfvntxiov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582087.7113154-1464-101247186213676/AnsiballZ_systemd.py'
Dec 01 09:41:27 compute-2 sudo[44287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:41:28 compute-2 python3.9[44289]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:41:28 compute-2 systemd[1]: Reloading.
Dec 01 09:41:28 compute-2 systemd-rc-local-generator[44320]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:41:28 compute-2 sudo[44287]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:29 compute-2 sudo[44479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukiudzxjntvfazcrrfscoietrmrbhxhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582088.9961886-1511-29502741204155/AnsiballZ_command.py'
Dec 01 09:41:29 compute-2 sudo[44479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:41:29 compute-2 sshd-session[44352]: Invalid user hu from 102.213.183.66 port 48020
Dec 01 09:41:29 compute-2 python3.9[44481]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:41:29 compute-2 sudo[44479]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:29 compute-2 sshd-session[44352]: Received disconnect from 102.213.183.66 port 48020:11: Bye Bye [preauth]
Dec 01 09:41:29 compute-2 sshd-session[44352]: Disconnected from invalid user hu 102.213.183.66 port 48020 [preauth]
Dec 01 09:41:30 compute-2 sudo[44632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swlwkxtmkquikytqjmzuvdpkjcxcrmpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582089.761227-1536-273173928205421/AnsiballZ_command.py'
Dec 01 09:41:30 compute-2 sudo[44632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:41:30 compute-2 python3.9[44634]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:41:30 compute-2 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Dec 01 09:41:30 compute-2 sudo[44632]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:30 compute-2 sudo[44785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqeddntyqdxlxmeligdeanblhxtldyrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582090.427354-1560-22059738450714/AnsiballZ_command.py'
Dec 01 09:41:30 compute-2 sudo[44785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:41:31 compute-2 python3.9[44787]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:41:32 compute-2 sudo[44785]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:32 compute-2 sudo[44947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqxshfofcvzqgnzibuucxhvqimwiabvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582092.6140108-1584-111038695972361/AnsiballZ_command.py'
Dec 01 09:41:32 compute-2 sudo[44947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:41:33 compute-2 python3.9[44949]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:41:33 compute-2 sudo[44947]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:33 compute-2 sudo[45100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnvulkxloltjsrfmkxhacdaraofcyaim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582093.3642719-1608-216706143744927/AnsiballZ_systemd.py'
Dec 01 09:41:33 compute-2 sudo[45100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:41:33 compute-2 python3.9[45102]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 09:41:34 compute-2 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 01 09:41:34 compute-2 systemd[1]: Stopped Apply Kernel Variables.
Dec 01 09:41:34 compute-2 systemd[1]: Stopping Apply Kernel Variables...
Dec 01 09:41:34 compute-2 systemd[1]: Starting Apply Kernel Variables...
Dec 01 09:41:34 compute-2 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 01 09:41:34 compute-2 systemd[1]: Finished Apply Kernel Variables.
Dec 01 09:41:34 compute-2 sudo[45100]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:34 compute-2 sshd-session[31491]: Connection closed by 192.168.122.30 port 53678
Dec 01 09:41:34 compute-2 sshd-session[31488]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:41:34 compute-2 systemd-logind[795]: Session 10 logged out. Waiting for processes to exit.
Dec 01 09:41:34 compute-2 systemd[1]: session-10.scope: Deactivated successfully.
Dec 01 09:41:34 compute-2 systemd[1]: session-10.scope: Consumed 2min 11.608s CPU time.
Dec 01 09:41:34 compute-2 systemd-logind[795]: Removed session 10.
Dec 01 09:41:40 compute-2 sshd-session[45132]: Accepted publickey for zuul from 192.168.122.30 port 54936 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 09:41:40 compute-2 systemd-logind[795]: New session 11 of user zuul.
Dec 01 09:41:40 compute-2 systemd[1]: Started Session 11 of User zuul.
Dec 01 09:41:40 compute-2 sshd-session[45132]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:41:41 compute-2 python3.9[45285]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:41:42 compute-2 sudo[45439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltydmiraoiynhlitujhrtwwxycemljgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582102.1306498-70-160462003286582/AnsiballZ_getent.py'
Dec 01 09:41:42 compute-2 sudo[45439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:41:42 compute-2 python3.9[45441]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec 01 09:41:42 compute-2 sudo[45439]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:43 compute-2 sudo[45592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfdgbzooyjwqiqjwwxcflxzrrutkhigj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582102.9682262-94-264497902230864/AnsiballZ_group.py'
Dec 01 09:41:43 compute-2 sudo[45592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:41:43 compute-2 python3.9[45594]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 01 09:41:43 compute-2 groupadd[45595]: group added to /etc/group: name=openvswitch, GID=42476
Dec 01 09:41:43 compute-2 groupadd[45595]: group added to /etc/gshadow: name=openvswitch
Dec 01 09:41:43 compute-2 groupadd[45595]: new group: name=openvswitch, GID=42476
Dec 01 09:41:43 compute-2 sudo[45592]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:44 compute-2 sudo[45750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mojelgcetitmcmkqkpciegnuuxhtveqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582103.964772-118-216543023387937/AnsiballZ_user.py'
Dec 01 09:41:44 compute-2 sudo[45750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:41:44 compute-2 python3.9[45752]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 01 09:41:44 compute-2 useradd[45754]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Dec 01 09:41:44 compute-2 useradd[45754]: add 'openvswitch' to group 'hugetlbfs'
Dec 01 09:41:44 compute-2 useradd[45754]: add 'openvswitch' to shadow group 'hugetlbfs'
Dec 01 09:41:44 compute-2 sudo[45750]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:45 compute-2 sudo[45910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cimwjzmqfzvrkhhfzdoccmovkrvxwnor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582105.1382875-148-199171959846492/AnsiballZ_setup.py'
Dec 01 09:41:45 compute-2 sudo[45910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:41:45 compute-2 python3.9[45912]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:41:45 compute-2 sudo[45910]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:46 compute-2 sudo[45994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdavllxqigmyelbnepkewveudxqvqrpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582105.1382875-148-199171959846492/AnsiballZ_dnf.py'
Dec 01 09:41:46 compute-2 sudo[45994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:41:46 compute-2 python3.9[45996]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 01 09:41:49 compute-2 sudo[45994]: pam_unix(sudo:session): session closed for user root
Dec 01 09:41:50 compute-2 sudo[46158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcjpcuxnrwqwnswgrsovhfsvjlgfmwug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582110.084298-190-100052243351455/AnsiballZ_dnf.py'
Dec 01 09:41:50 compute-2 sudo[46158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:41:50 compute-2 python3.9[46160]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:42:05 compute-2 kernel: SELinux:  Converting 2731 SID table entries...
Dec 01 09:42:05 compute-2 kernel: SELinux:  policy capability network_peer_controls=1
Dec 01 09:42:05 compute-2 kernel: SELinux:  policy capability open_perms=1
Dec 01 09:42:05 compute-2 kernel: SELinux:  policy capability extended_socket_class=1
Dec 01 09:42:05 compute-2 kernel: SELinux:  policy capability always_check_network=0
Dec 01 09:42:05 compute-2 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 01 09:42:05 compute-2 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 01 09:42:05 compute-2 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 01 09:42:05 compute-2 groupadd[46184]: group added to /etc/group: name=unbound, GID=993
Dec 01 09:42:05 compute-2 groupadd[46184]: group added to /etc/gshadow: name=unbound
Dec 01 09:42:05 compute-2 groupadd[46184]: new group: name=unbound, GID=993
Dec 01 09:42:05 compute-2 useradd[46191]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Dec 01 09:42:05 compute-2 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Dec 01 09:42:05 compute-2 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Dec 01 09:42:06 compute-2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 01 09:42:06 compute-2 systemd[1]: Starting man-db-cache-update.service...
Dec 01 09:42:06 compute-2 systemd[1]: Reloading.
Dec 01 09:42:06 compute-2 systemd-rc-local-generator[46688]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:42:06 compute-2 systemd-sysv-generator[46691]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:42:06 compute-2 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 01 09:42:07 compute-2 sudo[46158]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:07 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 01 09:42:07 compute-2 systemd[1]: Finished man-db-cache-update.service.
Dec 01 09:42:07 compute-2 systemd[1]: run-rf89aacf991b342eaba5ea7e1bb5699c7.service: Deactivated successfully.
Dec 01 09:42:07 compute-2 sudo[47258]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aasfpumbxrvtrhfvbywmohuymqcghhka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582127.4181657-214-20141134217510/AnsiballZ_systemd.py'
Dec 01 09:42:07 compute-2 sudo[47258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:08 compute-2 python3.9[47260]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 01 09:42:08 compute-2 systemd[1]: Reloading.
Dec 01 09:42:08 compute-2 systemd-sysv-generator[47294]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:42:08 compute-2 systemd-rc-local-generator[47291]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:42:08 compute-2 systemd[1]: Starting Open vSwitch Database Unit...
Dec 01 09:42:08 compute-2 chown[47303]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Dec 01 09:42:08 compute-2 ovs-ctl[47308]: /etc/openvswitch/conf.db does not exist ... (warning).
Dec 01 09:42:08 compute-2 ovs-ctl[47308]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Dec 01 09:42:08 compute-2 ovs-ctl[47308]: Starting ovsdb-server [  OK  ]
Dec 01 09:42:08 compute-2 ovs-vsctl[47357]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Dec 01 09:42:08 compute-2 ovs-vsctl[47377]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"968d9d26-f45d-4d49-addd-0befc9c8f4a3\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Dec 01 09:42:08 compute-2 ovs-ctl[47308]: Configuring Open vSwitch system IDs [  OK  ]
Dec 01 09:42:08 compute-2 ovs-vsctl[47383]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Dec 01 09:42:08 compute-2 ovs-ctl[47308]: Enabling remote OVSDB managers [  OK  ]
Dec 01 09:42:08 compute-2 systemd[1]: Started Open vSwitch Database Unit.
Dec 01 09:42:08 compute-2 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Dec 01 09:42:08 compute-2 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Dec 01 09:42:08 compute-2 systemd[1]: Starting Open vSwitch Forwarding Unit...
Dec 01 09:42:08 compute-2 kernel: openvswitch: Open vSwitch switching datapath
Dec 01 09:42:08 compute-2 ovs-ctl[47427]: Inserting openvswitch module [  OK  ]
Dec 01 09:42:09 compute-2 ovs-ctl[47396]: Starting ovs-vswitchd [  OK  ]
Dec 01 09:42:09 compute-2 ovs-ctl[47396]: Enabling remote OVSDB managers [  OK  ]
Dec 01 09:42:09 compute-2 ovs-vsctl[47445]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Dec 01 09:42:09 compute-2 systemd[1]: Started Open vSwitch Forwarding Unit.
Dec 01 09:42:09 compute-2 systemd[1]: Starting Open vSwitch...
Dec 01 09:42:09 compute-2 systemd[1]: Finished Open vSwitch.
Dec 01 09:42:09 compute-2 sudo[47258]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:10 compute-2 python3.9[47596]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:42:11 compute-2 sudo[47746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osmhdkmosnxnplvouqrhykpucbauyoin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582130.8151739-268-141094787972781/AnsiballZ_sefcontext.py'
Dec 01 09:42:11 compute-2 sudo[47746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:11 compute-2 python3.9[47748]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec 01 09:42:12 compute-2 kernel: SELinux:  Converting 2745 SID table entries...
Dec 01 09:42:12 compute-2 kernel: SELinux:  policy capability network_peer_controls=1
Dec 01 09:42:12 compute-2 kernel: SELinux:  policy capability open_perms=1
Dec 01 09:42:12 compute-2 kernel: SELinux:  policy capability extended_socket_class=1
Dec 01 09:42:12 compute-2 kernel: SELinux:  policy capability always_check_network=0
Dec 01 09:42:12 compute-2 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 01 09:42:12 compute-2 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 01 09:42:12 compute-2 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 01 09:42:12 compute-2 sudo[47746]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:13 compute-2 python3.9[47903]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:42:14 compute-2 sudo[48059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxphkpihbkivniszbiqarqlyesnkmtfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582134.3575642-322-244313958722122/AnsiballZ_dnf.py'
Dec 01 09:42:14 compute-2 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Dec 01 09:42:14 compute-2 sudo[48059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:14 compute-2 python3.9[48061]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:42:16 compute-2 sudo[48059]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:16 compute-2 sudo[48212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxayfjwyylwdtbmmrjdakiglieoreouc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582136.4235666-347-111339577773637/AnsiballZ_command.py'
Dec 01 09:42:16 compute-2 sudo[48212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:17 compute-2 python3.9[48214]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:42:17 compute-2 sudo[48212]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:18 compute-2 sudo[48499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qilopewtezfljtwxkkullwgefuchqxcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582138.0146937-371-142369364952192/AnsiballZ_file.py'
Dec 01 09:42:18 compute-2 sudo[48499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:19 compute-2 python3.9[48501]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 01 09:42:19 compute-2 sudo[48499]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:19 compute-2 python3.9[48651]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:42:20 compute-2 sudo[48803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hahhkarxdzfuuykpjoowfdygyxiwlkqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582140.1874497-418-271474173161487/AnsiballZ_dnf.py'
Dec 01 09:42:20 compute-2 sudo[48803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:20 compute-2 python3.9[48805]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:42:22 compute-2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 01 09:42:22 compute-2 systemd[1]: Starting man-db-cache-update.service...
Dec 01 09:42:22 compute-2 systemd[1]: Reloading.
Dec 01 09:42:22 compute-2 systemd-sysv-generator[48847]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:42:22 compute-2 systemd-rc-local-generator[48842]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:42:23 compute-2 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 01 09:42:23 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 01 09:42:23 compute-2 systemd[1]: Finished man-db-cache-update.service.
Dec 01 09:42:23 compute-2 systemd[1]: run-r5a83ee8cf2c04985b9e957a305ffb29c.service: Deactivated successfully.
Dec 01 09:42:23 compute-2 sudo[48803]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:23 compute-2 sudo[49122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyrdmaajdxuchidyljghigdtydxkcpzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582143.5705364-442-221772358982097/AnsiballZ_systemd.py'
Dec 01 09:42:23 compute-2 sudo[49122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:24 compute-2 python3.9[49124]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 09:42:24 compute-2 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 01 09:42:24 compute-2 systemd[1]: Stopped Network Manager Wait Online.
Dec 01 09:42:24 compute-2 systemd[1]: Stopping Network Manager Wait Online...
Dec 01 09:42:24 compute-2 systemd[1]: Stopping Network Manager...
Dec 01 09:42:24 compute-2 NetworkManager[7192]: <info>  [1764582144.1775] caught SIGTERM, shutting down normally.
Dec 01 09:42:24 compute-2 NetworkManager[7192]: <info>  [1764582144.1787] dhcp4 (eth0): canceled DHCP transaction
Dec 01 09:42:24 compute-2 NetworkManager[7192]: <info>  [1764582144.1787] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 01 09:42:24 compute-2 NetworkManager[7192]: <info>  [1764582144.1787] dhcp4 (eth0): state changed no lease
Dec 01 09:42:24 compute-2 NetworkManager[7192]: <info>  [1764582144.1789] manager: NetworkManager state is now CONNECTED_SITE
Dec 01 09:42:24 compute-2 NetworkManager[7192]: <info>  [1764582144.1847] exiting (success)
Dec 01 09:42:24 compute-2 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 01 09:42:24 compute-2 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 01 09:42:24 compute-2 systemd[1]: Stopped Network Manager.
Dec 01 09:42:24 compute-2 systemd[1]: NetworkManager.service: Consumed 10.863s CPU time, 4.3M memory peak, read 0B from disk, written 20.0K to disk.
Dec 01 09:42:24 compute-2 systemd[1]: Starting Network Manager...
Dec 01 09:42:24 compute-2 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.2523] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:b3cb21dd-233c-423c-aa19-329645e7ae96)
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.2526] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.2594] manager[0x55e132082090]: monitoring kernel firmware directory '/lib/firmware'.
Dec 01 09:42:24 compute-2 systemd[1]: Starting Hostname Service...
Dec 01 09:42:24 compute-2 systemd[1]: Started Hostname Service.
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3316] hostname: hostname: using hostnamed
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3317] hostname: static hostname changed from (none) to "compute-2"
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3320] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3324] manager[0x55e132082090]: rfkill: Wi-Fi hardware radio set enabled
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3324] manager[0x55e132082090]: rfkill: WWAN hardware radio set enabled
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3344] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3353] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3353] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3354] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3354] manager: Networking is enabled by state file
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3356] settings: Loaded settings plugin: keyfile (internal)
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3359] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3385] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3394] dhcp: init: Using DHCP client 'internal'
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3397] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3401] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3406] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3413] device (lo): Activation: starting connection 'lo' (85edc75a-527c-4c5c-9e5c-ea0fbf93ba32)
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3420] device (eth0): carrier: link connected
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3424] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3430] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3430] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3436] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3442] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3447] device (eth1): carrier: link connected
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3450] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3455] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (6f5dc6d4-dddd-5827-b41a-7cbbcd00f91f) (indicated)
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3456] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3461] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3466] device (eth1): Activation: starting connection 'ci-private-network' (6f5dc6d4-dddd-5827-b41a-7cbbcd00f91f)
Dec 01 09:42:24 compute-2 systemd[1]: Started Network Manager.
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3472] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3494] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3497] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3498] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3500] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3501] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3503] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3504] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3506] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3513] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3515] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3544] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3556] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3563] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3565] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3570] device (lo): Activation: successful, device activated.
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3576] dhcp4 (eth0): state changed new lease, address=38.102.83.236
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3584] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 01 09:42:24 compute-2 systemd[1]: Starting Network Manager Wait Online...
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3648] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3656] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3663] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3666] manager: NetworkManager state is now CONNECTED_LOCAL
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3669] device (eth1): Activation: successful, device activated.
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3680] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3681] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3684] manager: NetworkManager state is now CONNECTED_SITE
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3687] device (eth0): Activation: successful, device activated.
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3691] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 01 09:42:24 compute-2 NetworkManager[49132]: <info>  [1764582144.3693] manager: startup complete
Dec 01 09:42:24 compute-2 sudo[49122]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:24 compute-2 systemd[1]: Finished Network Manager Wait Online.
Dec 01 09:42:24 compute-2 sudo[49348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgxdidbybluorwzpfeekjyxivtjsmjbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582144.6812117-466-79342376873076/AnsiballZ_dnf.py'
Dec 01 09:42:24 compute-2 sudo[49348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:25 compute-2 python3.9[49350]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:42:30 compute-2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 01 09:42:30 compute-2 systemd[1]: Starting man-db-cache-update.service...
Dec 01 09:42:30 compute-2 systemd[1]: Reloading.
Dec 01 09:42:30 compute-2 systemd-rc-local-generator[49395]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:42:30 compute-2 systemd-sysv-generator[49400]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:42:31 compute-2 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 01 09:42:31 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 01 09:42:31 compute-2 systemd[1]: Finished man-db-cache-update.service.
Dec 01 09:42:31 compute-2 systemd[1]: run-r312bc9dcb5b94a60b5eaecc4bcd71b2e.service: Deactivated successfully.
Dec 01 09:42:32 compute-2 sudo[49348]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:32 compute-2 sudo[49806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijrvsejvdgoftagbpsxtlwsvftmfoxqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582152.5633862-502-152974921438702/AnsiballZ_stat.py'
Dec 01 09:42:32 compute-2 sudo[49806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:33 compute-2 python3.9[49808]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:42:33 compute-2 sudo[49806]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:33 compute-2 sudo[49958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scyavihsxjpqbretvfpmewqnwmgcpvmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582153.2957625-529-66526152709642/AnsiballZ_ini_file.py'
Dec 01 09:42:33 compute-2 sudo[49958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:33 compute-2 python3.9[49960]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:42:33 compute-2 sudo[49958]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:34 compute-2 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 01 09:42:34 compute-2 sudo[50112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnqjyihmsgeyobszajlxxqflmagnhkwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582154.3053026-559-64970500585870/AnsiballZ_ini_file.py'
Dec 01 09:42:34 compute-2 sudo[50112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:34 compute-2 python3.9[50114]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:42:34 compute-2 sudo[50112]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:35 compute-2 sudo[50264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkfvzfrhorbjboorszpolvrzipssmjwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582154.8976517-559-102545715805820/AnsiballZ_ini_file.py'
Dec 01 09:42:35 compute-2 sudo[50264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:35 compute-2 python3.9[50266]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:42:35 compute-2 sudo[50264]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:35 compute-2 sudo[50416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tptnnbkimgrstgqxxqxzeuhmvcbdckdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582155.6069171-604-165939592826870/AnsiballZ_ini_file.py'
Dec 01 09:42:35 compute-2 sudo[50416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:36 compute-2 python3.9[50418]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:42:36 compute-2 sudo[50416]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:36 compute-2 sudo[50568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptzngnidefszhlolyxjnjbcjixufgwly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582156.2129083-604-55401250525850/AnsiballZ_ini_file.py'
Dec 01 09:42:36 compute-2 sudo[50568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:36 compute-2 python3.9[50570]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:42:36 compute-2 sudo[50568]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:37 compute-2 sudo[50720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjsdcykzzfwuzcggvxzdtdwmoltbpsmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582156.953682-649-136432848574910/AnsiballZ_stat.py'
Dec 01 09:42:37 compute-2 sudo[50720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:37 compute-2 python3.9[50722]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:42:37 compute-2 sudo[50720]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:37 compute-2 sudo[50843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hztyxsrshcfvvubufdlnjpluvhijuntd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582156.953682-649-136432848574910/AnsiballZ_copy.py'
Dec 01 09:42:37 compute-2 sudo[50843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:38 compute-2 python3.9[50845]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764582156.953682-649-136432848574910/.source _original_basename=.6u13nczr follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:42:38 compute-2 sudo[50843]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:38 compute-2 sudo[50995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aiqjbbvzfwjftsmmblzybnlrqlqlaeoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582158.38473-694-90831204248366/AnsiballZ_file.py'
Dec 01 09:42:38 compute-2 sudo[50995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:38 compute-2 python3.9[50997]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:42:38 compute-2 sudo[50995]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:39 compute-2 sudo[51147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hoinxretljpvsmbxxppwfffxyuwxlxma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582159.1213725-719-83644905695988/AnsiballZ_edpm_os_net_config_mappings.py'
Dec 01 09:42:39 compute-2 sudo[51147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:39 compute-2 python3.9[51149]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Dec 01 09:42:39 compute-2 sudo[51147]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:40 compute-2 sudo[51299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpbcylbpotyjhjgylzwhmqdechsubhkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582159.9873672-745-151912235393450/AnsiballZ_file.py'
Dec 01 09:42:40 compute-2 sudo[51299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:40 compute-2 python3.9[51301]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:42:40 compute-2 sudo[51299]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:41 compute-2 sudo[51451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qovtjkiczgyparwkmrrlqaeecvhxpthv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582160.813371-775-66920296176488/AnsiballZ_stat.py'
Dec 01 09:42:41 compute-2 sudo[51451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:41 compute-2 sudo[51451]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:41 compute-2 sudo[51574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-duhkzywjfjigdyypnjjvlcdqjaimvwqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582160.813371-775-66920296176488/AnsiballZ_copy.py'
Dec 01 09:42:41 compute-2 sudo[51574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:41 compute-2 sudo[51574]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:42 compute-2 sudo[51726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddnjjjfonkzjpvddgeddhvnqmjlncwzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582162.1979604-820-244270749551687/AnsiballZ_slurp.py'
Dec 01 09:42:42 compute-2 sudo[51726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:42 compute-2 python3.9[51728]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Dec 01 09:42:42 compute-2 sudo[51726]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:43 compute-2 sudo[51901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxmxcuhqhmcbypiqrpibchocknmjsszh ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582163.1852202-847-280430874517915/async_wrapper.py j17587752371 300 /home/zuul/.ansible/tmp/ansible-tmp-1764582163.1852202-847-280430874517915/AnsiballZ_edpm_os_net_config.py _'
Dec 01 09:42:43 compute-2 sudo[51901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:44 compute-2 ansible-async_wrapper.py[51903]: Invoked with j17587752371 300 /home/zuul/.ansible/tmp/ansible-tmp-1764582163.1852202-847-280430874517915/AnsiballZ_edpm_os_net_config.py _
Dec 01 09:42:44 compute-2 ansible-async_wrapper.py[51906]: Starting module and watcher
Dec 01 09:42:44 compute-2 ansible-async_wrapper.py[51906]: Start watching 51907 (300)
Dec 01 09:42:44 compute-2 ansible-async_wrapper.py[51907]: Start module (51907)
Dec 01 09:42:44 compute-2 ansible-async_wrapper.py[51903]: Return async_wrapper task started.
Dec 01 09:42:44 compute-2 sudo[51901]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:44 compute-2 python3.9[51908]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Dec 01 09:42:44 compute-2 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Dec 01 09:42:44 compute-2 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Dec 01 09:42:44 compute-2 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Dec 01 09:42:44 compute-2 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Dec 01 09:42:44 compute-2 kernel: cfg80211: failed to load regulatory.db
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0308] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51909 uid=0 result="success"
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0326] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51909 uid=0 result="success"
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0738] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0739] audit: op="connection-add" uuid="6983ee3b-6ac0-4201-aaa8-745f2bdeec97" name="br-ex-br" pid=51909 uid=0 result="success"
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0750] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0752] audit: op="connection-add" uuid="4f223009-ae91-4014-bc19-eff9ca4bc818" name="br-ex-port" pid=51909 uid=0 result="success"
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0762] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0763] audit: op="connection-add" uuid="78922a2c-c1da-4f79-89ed-2c7e244e5438" name="eth1-port" pid=51909 uid=0 result="success"
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0774] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0775] audit: op="connection-add" uuid="22de3c57-35c4-4e26-a867-5397dcc0d146" name="vlan20-port" pid=51909 uid=0 result="success"
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0785] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0786] audit: op="connection-add" uuid="6b3dc5ff-3716-41ce-b780-cfc1b7e27abc" name="vlan21-port" pid=51909 uid=0 result="success"
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0795] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0797] audit: op="connection-add" uuid="ff380093-d1b6-4b31-b4cc-05101b9678c2" name="vlan22-port" pid=51909 uid=0 result="success"
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0807] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0809] audit: op="connection-add" uuid="c0eb9127-f4dd-4dd1-9380-f84dc9b7a7b0" name="vlan23-port" pid=51909 uid=0 result="success"
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0828] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="802-3-ethernet.mtu,connection.timestamp,connection.autoconnect-priority,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.method,ipv6.dhcp-timeout,ipv6.addr-gen-mode" pid=51909 uid=0 result="success"
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0842] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0843] audit: op="connection-add" uuid="d3eb1f33-c30a-4048-8c77-d4c11e08232f" name="br-ex-if" pid=51909 uid=0 result="success"
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0884] audit: op="connection-update" uuid="6f5dc6d4-dddd-5827-b41a-7cbbcd00f91f" name="ci-private-network" args="ovs-external-ids.data,ovs-interface.type,connection.timestamp,connection.master,connection.port-type,connection.controller,connection.slave-type,ipv4.addresses,ipv4.method,ipv4.dns,ipv4.never-default,ipv4.routes,ipv4.routing-rules,ipv6.addresses,ipv6.method,ipv6.dns,ipv6.routing-rules,ipv6.routes,ipv6.addr-gen-mode" pid=51909 uid=0 result="success"
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0898] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0900] audit: op="connection-add" uuid="f74ed6c1-ce44-47ff-8df0-dd8dbb8b3707" name="vlan20-if" pid=51909 uid=0 result="success"
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0912] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0914] audit: op="connection-add" uuid="fbd8a8e7-35c2-4971-bf6a-2dd92cde7ae1" name="vlan21-if" pid=51909 uid=0 result="success"
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0927] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0929] audit: op="connection-add" uuid="645e20e9-649f-45b1-a947-9d9936da2129" name="vlan22-if" pid=51909 uid=0 result="success"
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0942] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0943] audit: op="connection-add" uuid="fd570e13-d6f0-4151-aaf6-41c48aa466f3" name="vlan23-if" pid=51909 uid=0 result="success"
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0952] audit: op="connection-delete" uuid="094bb5e5-ea9a-3656-b952-5d4c84d86268" name="Wired connection 1" pid=51909 uid=0 result="success"
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0962] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0971] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0974] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (6983ee3b-6ac0-4201-aaa8-745f2bdeec97)
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0975] audit: op="connection-activate" uuid="6983ee3b-6ac0-4201-aaa8-745f2bdeec97" name="br-ex-br" pid=51909 uid=0 result="success"
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0978] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0983] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0986] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (4f223009-ae91-4014-bc19-eff9ca4bc818)
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0988] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0992] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0996] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (78922a2c-c1da-4f79-89ed-2c7e244e5438)
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.0998] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1002] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1006] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (22de3c57-35c4-4e26-a867-5397dcc0d146)
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1007] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1012] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1015] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (6b3dc5ff-3716-41ce-b780-cfc1b7e27abc)
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1017] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1022] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1026] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (ff380093-d1b6-4b31-b4cc-05101b9678c2)
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1028] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1032] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1036] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (c0eb9127-f4dd-4dd1-9380-f84dc9b7a7b0)
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1038] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1039] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1041] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1046] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1050] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1053] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (d3eb1f33-c30a-4048-8c77-d4c11e08232f)
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1054] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1057] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1059] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1060] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1061] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1070] device (eth1): disconnecting for new activation request.
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1070] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1073] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1075] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1077] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1080] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1084] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1087] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (f74ed6c1-ce44-47ff-8df0-dd8dbb8b3707)
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1089] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1091] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1093] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1095] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1097] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1101] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1105] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (fbd8a8e7-35c2-4971-bf6a-2dd92cde7ae1)
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1106] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1108] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1109] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1111] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1113] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1117] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1119] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (645e20e9-649f-45b1-a947-9d9936da2129)
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1120] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1122] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1124] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1126] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1128] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1131] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1135] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (fd570e13-d6f0-4151-aaf6-41c48aa466f3)
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1136] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1138] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1140] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1141] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1143] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1154] audit: op="device-reapply" interface="eth0" ifindex=2 args="802-3-ethernet.mtu,connection.autoconnect-priority,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode" pid=51909 uid=0 result="success"
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1155] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1161] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1164] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1172] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1176] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1180] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1183] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1185] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1190] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 kernel: ovs-system: entered promiscuous mode
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1196] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1199] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1200] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1205] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1209] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 systemd-udevd[51914]: Network interface NamePolicy= disabled on kernel command line.
Dec 01 09:42:46 compute-2 kernel: Timeout policy base is empty
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1212] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1213] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1218] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1222] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1225] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1227] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1231] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1235] dhcp4 (eth0): canceled DHCP transaction
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1235] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1235] dhcp4 (eth0): state changed no lease
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1237] dhcp4 (eth0): activation: beginning transaction (no timeout)
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1246] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1251] audit: op="device-reapply" interface="eth1" ifindex=3 pid=51909 uid=0 result="fail" reason="Device is not activated"
Dec 01 09:42:46 compute-2 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1297] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1301] dhcp4 (eth0): state changed new lease, address=38.102.83.236
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1306] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Dec 01 09:42:46 compute-2 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1369] device (eth1): disconnecting for new activation request.
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1370] audit: op="connection-activate" uuid="6f5dc6d4-dddd-5827-b41a-7cbbcd00f91f" name="ci-private-network" pid=51909 uid=0 result="success"
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1371] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1378] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Dec 01 09:42:46 compute-2 kernel: br-ex: entered promiscuous mode
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1528] device (eth1): Activation: starting connection 'ci-private-network' (6f5dc6d4-dddd-5827-b41a-7cbbcd00f91f)
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1535] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1540] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1558] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1562] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1570] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1575] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1583] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51909 uid=0 result="success"
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1584] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1586] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1588] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1590] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1592] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1595] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1598] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1605] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1610] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1615] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1619] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1624] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1629] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1634] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1639] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1643] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1649] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1654] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1661] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1669] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1673] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1683] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Dec 01 09:42:46 compute-2 kernel: vlan22: entered promiscuous mode
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1702] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1711] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1714] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1721] device (eth1): Activation: successful, device activated.
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1732] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1735] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1740] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 01 09:42:46 compute-2 kernel: vlan21: entered promiscuous mode
Dec 01 09:42:46 compute-2 systemd-udevd[51913]: Network interface NamePolicy= disabled on kernel command line.
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1791] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1802] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 kernel: vlan23: entered promiscuous mode
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1822] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1825] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1831] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1868] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1884] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 kernel: vlan20: entered promiscuous mode
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1916] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1934] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1946] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1950] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1959] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1971] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1973] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.1981] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.2022] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.2037] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.2056] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.2057] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 01 09:42:46 compute-2 NetworkManager[49132]: <info>  [1764582166.2066] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 01 09:42:47 compute-2 NetworkManager[49132]: <info>  [1764582167.3384] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51909 uid=0 result="success"
Dec 01 09:42:47 compute-2 NetworkManager[49132]: <info>  [1764582167.4932] checkpoint[0x55e132058950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Dec 01 09:42:47 compute-2 NetworkManager[49132]: <info>  [1764582167.4935] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51909 uid=0 result="success"
Dec 01 09:42:47 compute-2 sudo[52265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dewiuzaljwjbfufduvokkbhlyeudfrjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582167.183942-847-183073291758276/AnsiballZ_async_status.py'
Dec 01 09:42:47 compute-2 sudo[52265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:47 compute-2 python3.9[52267]: ansible-ansible.legacy.async_status Invoked with jid=j17587752371.51903 mode=status _async_dir=/root/.ansible_async
Dec 01 09:42:47 compute-2 NetworkManager[49132]: <info>  [1764582167.7923] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51909 uid=0 result="success"
Dec 01 09:42:47 compute-2 NetworkManager[49132]: <info>  [1764582167.7934] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51909 uid=0 result="success"
Dec 01 09:42:47 compute-2 sudo[52265]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:47 compute-2 NetworkManager[49132]: <info>  [1764582167.9998] audit: op="networking-control" arg="global-dns-configuration" pid=51909 uid=0 result="success"
Dec 01 09:42:48 compute-2 NetworkManager[49132]: <info>  [1764582168.0024] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Dec 01 09:42:48 compute-2 NetworkManager[49132]: <info>  [1764582168.0048] audit: op="networking-control" arg="global-dns-configuration" pid=51909 uid=0 result="success"
Dec 01 09:42:48 compute-2 NetworkManager[49132]: <info>  [1764582168.0070] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51909 uid=0 result="success"
Dec 01 09:42:48 compute-2 NetworkManager[49132]: <info>  [1764582168.1324] checkpoint[0x55e132058a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Dec 01 09:42:48 compute-2 NetworkManager[49132]: <info>  [1764582168.1328] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51909 uid=0 result="success"
Dec 01 09:42:48 compute-2 ansible-async_wrapper.py[51907]: Module complete (51907)
Dec 01 09:42:49 compute-2 ansible-async_wrapper.py[51906]: Done in kid B.
Dec 01 09:42:51 compute-2 sudo[52370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcyhekxcuwpgwivzhwoybrvvbomzbxls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582167.183942-847-183073291758276/AnsiballZ_async_status.py'
Dec 01 09:42:51 compute-2 sudo[52370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:51 compute-2 python3.9[52372]: ansible-ansible.legacy.async_status Invoked with jid=j17587752371.51903 mode=status _async_dir=/root/.ansible_async
Dec 01 09:42:51 compute-2 sudo[52370]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:51 compute-2 sudo[52470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yaeewtxfwccjixhssnpbjpqpfrvwegfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582167.183942-847-183073291758276/AnsiballZ_async_status.py'
Dec 01 09:42:51 compute-2 sudo[52470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:51 compute-2 python3.9[52472]: ansible-ansible.legacy.async_status Invoked with jid=j17587752371.51903 mode=cleanup _async_dir=/root/.ansible_async
Dec 01 09:42:51 compute-2 sudo[52470]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:52 compute-2 sudo[52624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlveitynocdqcznmvmnvxhjnamlmnhdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582172.2010455-928-49938939344334/AnsiballZ_stat.py'
Dec 01 09:42:52 compute-2 sudo[52624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:52 compute-2 sshd-session[52473]: Received disconnect from 102.213.183.66 port 47468:11: Bye Bye [preauth]
Dec 01 09:42:52 compute-2 sshd-session[52473]: Disconnected from authenticating user root 102.213.183.66 port 47468 [preauth]
Dec 01 09:42:52 compute-2 python3.9[52626]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:42:52 compute-2 sudo[52624]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:53 compute-2 sudo[52747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmljmqpyjncvknkpklghnsclbbgbeucc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582172.2010455-928-49938939344334/AnsiballZ_copy.py'
Dec 01 09:42:53 compute-2 sudo[52747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:53 compute-2 python3.9[52749]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764582172.2010455-928-49938939344334/.source.returncode _original_basename=.xhk9_zfp follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:42:53 compute-2 sudo[52747]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:53 compute-2 sudo[52899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlznqtvwmrncculpgnaqroskztoxwlma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582173.561003-976-35506184447841/AnsiballZ_stat.py'
Dec 01 09:42:53 compute-2 sudo[52899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:53 compute-2 python3.9[52901]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:42:53 compute-2 sudo[52899]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:54 compute-2 sudo[53022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvyhldccseqskxosomspusbejoawtvng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582173.561003-976-35506184447841/AnsiballZ_copy.py'
Dec 01 09:42:54 compute-2 sudo[53022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:54 compute-2 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 01 09:42:54 compute-2 python3.9[53024]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764582173.561003-976-35506184447841/.source.cfg _original_basename=.hdorbgwj follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:42:54 compute-2 sudo[53022]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:55 compute-2 sudo[53177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzsposvlmwnxkulkmrgxprhggfphltlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582175.1297302-1022-272331565067722/AnsiballZ_systemd.py'
Dec 01 09:42:55 compute-2 sudo[53177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:42:55 compute-2 python3.9[53179]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 09:42:55 compute-2 systemd[1]: Reloading Network Manager...
Dec 01 09:42:55 compute-2 NetworkManager[49132]: <info>  [1764582175.7831] audit: op="reload" arg="0" pid=53183 uid=0 result="success"
Dec 01 09:42:55 compute-2 NetworkManager[49132]: <info>  [1764582175.7839] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Dec 01 09:42:55 compute-2 systemd[1]: Reloaded Network Manager.
Dec 01 09:42:55 compute-2 sudo[53177]: pam_unix(sudo:session): session closed for user root
Dec 01 09:42:56 compute-2 sshd-session[45135]: Connection closed by 192.168.122.30 port 54936
Dec 01 09:42:56 compute-2 sshd-session[45132]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:42:56 compute-2 systemd[1]: session-11.scope: Deactivated successfully.
Dec 01 09:42:56 compute-2 systemd[1]: session-11.scope: Consumed 53.625s CPU time.
Dec 01 09:42:56 compute-2 systemd-logind[795]: Session 11 logged out. Waiting for processes to exit.
Dec 01 09:42:56 compute-2 systemd-logind[795]: Removed session 11.
Dec 01 09:43:01 compute-2 sshd-session[53214]: Accepted publickey for zuul from 192.168.122.30 port 53786 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 09:43:01 compute-2 systemd-logind[795]: New session 12 of user zuul.
Dec 01 09:43:01 compute-2 systemd[1]: Started Session 12 of User zuul.
Dec 01 09:43:01 compute-2 sshd-session[53214]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:43:02 compute-2 python3.9[53367]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:43:03 compute-2 python3.9[53521]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:43:05 compute-2 python3.9[53715]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:43:05 compute-2 sshd-session[53217]: Connection closed by 192.168.122.30 port 53786
Dec 01 09:43:05 compute-2 sshd-session[53214]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:43:05 compute-2 systemd[1]: session-12.scope: Deactivated successfully.
Dec 01 09:43:05 compute-2 systemd[1]: session-12.scope: Consumed 2.195s CPU time.
Dec 01 09:43:05 compute-2 systemd-logind[795]: Session 12 logged out. Waiting for processes to exit.
Dec 01 09:43:05 compute-2 systemd-logind[795]: Removed session 12.
Dec 01 09:43:05 compute-2 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 01 09:43:10 compute-2 sshd-session[53744]: Accepted publickey for zuul from 192.168.122.30 port 36228 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 09:43:10 compute-2 systemd-logind[795]: New session 13 of user zuul.
Dec 01 09:43:10 compute-2 systemd[1]: Started Session 13 of User zuul.
Dec 01 09:43:10 compute-2 sshd-session[53744]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:43:11 compute-2 python3.9[53897]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:43:13 compute-2 python3.9[54051]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:43:13 compute-2 sudo[54206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgbduhvmoawflatgajjvpdmzlznbbjec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582193.5275838-82-98178216909847/AnsiballZ_setup.py'
Dec 01 09:43:13 compute-2 sudo[54206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:14 compute-2 python3.9[54208]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:43:14 compute-2 sudo[54206]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:14 compute-2 sudo[54290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbjilfgvnfasvtvbmpzdhwukxjomgnfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582193.5275838-82-98178216909847/AnsiballZ_dnf.py'
Dec 01 09:43:14 compute-2 sudo[54290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:14 compute-2 python3.9[54292]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:43:16 compute-2 sudo[54290]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:16 compute-2 sudo[54443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yejaxyxtuooqnipkoalvdhwcgsaewtxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582196.4547992-118-201878484218128/AnsiballZ_setup.py'
Dec 01 09:43:16 compute-2 sudo[54443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:17 compute-2 python3.9[54445]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:43:17 compute-2 sudo[54443]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:18 compute-2 sudo[54639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ziwzmrpqigxmuoisnmstudkfigkhgxga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582197.726788-151-236920059685686/AnsiballZ_file.py'
Dec 01 09:43:18 compute-2 sudo[54639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:18 compute-2 python3.9[54641]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:43:18 compute-2 sudo[54639]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:19 compute-2 sudo[54791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sevndhmvsaizkitcuzzccatchxpeebht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582198.6668599-176-85993500809164/AnsiballZ_command.py'
Dec 01 09:43:19 compute-2 sudo[54791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:19 compute-2 python3.9[54793]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:43:19 compute-2 podman[54794]: 2025-12-01 09:43:19.600929278 +0000 UTC m=+0.064879186 system refresh
Dec 01 09:43:19 compute-2 sudo[54791]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:20 compute-2 sudo[54955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpxwycimlqytqijklaghuhpgedlzkjeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582199.8220577-200-218937046755078/AnsiballZ_stat.py'
Dec 01 09:43:20 compute-2 sudo[54955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:20 compute-2 python3.9[54957]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:43:20 compute-2 sudo[54955]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:20 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 01 09:43:20 compute-2 sudo[55078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqovbludczwmbtjukuuclbyekndzcosx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582199.8220577-200-218937046755078/AnsiballZ_copy.py'
Dec 01 09:43:20 compute-2 sudo[55078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:21 compute-2 python3.9[55080]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582199.8220577-200-218937046755078/.source.json follow=False _original_basename=podman_network_config.j2 checksum=0cef8eebceb501cd3b4718b7ff3ce62bde3f8458 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:43:21 compute-2 sudo[55078]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:21 compute-2 sudo[55230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onqjblgjupilbsamibzkjfeffcdrvmvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582201.409524-244-278451014373609/AnsiballZ_stat.py'
Dec 01 09:43:21 compute-2 sudo[55230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:21 compute-2 python3.9[55232]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:43:21 compute-2 sudo[55230]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:22 compute-2 sudo[55353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uawdatrfzqnkhgmjoqhztwcaqucykuma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582201.409524-244-278451014373609/AnsiballZ_copy.py'
Dec 01 09:43:22 compute-2 sudo[55353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:22 compute-2 python3.9[55355]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764582201.409524-244-278451014373609/.source.conf follow=False _original_basename=registries.conf.j2 checksum=a92d4bce7d9cad3a31d9a297b9e21f629ee446cd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:43:22 compute-2 sudo[55353]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:23 compute-2 sudo[55505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjygtyjfpvrkfkxgmoxzlivhxexosfsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582202.8229117-292-112692082931352/AnsiballZ_ini_file.py'
Dec 01 09:43:23 compute-2 sudo[55505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:23 compute-2 python3.9[55507]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:43:23 compute-2 sudo[55505]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:23 compute-2 sudo[55657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flreyhxrgtanaekcbyyfqxpmkywdswba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582203.5753014-292-141991585727724/AnsiballZ_ini_file.py'
Dec 01 09:43:23 compute-2 sudo[55657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:24 compute-2 python3.9[55659]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:43:24 compute-2 sudo[55657]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:24 compute-2 sudo[55809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-domwsvirzrjjnhswafqcuvewycuoveon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582204.1846137-292-147292493726590/AnsiballZ_ini_file.py'
Dec 01 09:43:24 compute-2 sudo[55809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:24 compute-2 python3.9[55811]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:43:24 compute-2 sudo[55809]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:25 compute-2 sudo[55961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sowejdxnmiwqrlrxrpzcxlrfrveahoqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582204.7823617-292-108383389690549/AnsiballZ_ini_file.py'
Dec 01 09:43:25 compute-2 sudo[55961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:25 compute-2 python3.9[55963]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:43:25 compute-2 sudo[55961]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:26 compute-2 sudo[56113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdfjurhmwnioiwxlaqwoqmjbtjhodqjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582206.0598805-385-199930562575901/AnsiballZ_dnf.py'
Dec 01 09:43:26 compute-2 sudo[56113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:26 compute-2 python3.9[56115]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:43:28 compute-2 sudo[56113]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:28 compute-2 sudo[56266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmnlksczlddlxlgzbcvkuylttxzgvcwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582208.6160817-418-15141940665915/AnsiballZ_setup.py'
Dec 01 09:43:28 compute-2 sudo[56266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:29 compute-2 python3.9[56268]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:43:29 compute-2 sudo[56266]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:29 compute-2 sudo[56420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntctqqsjxxzssvatillkhwhpodstbsdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582209.4437976-442-127603171915539/AnsiballZ_stat.py'
Dec 01 09:43:29 compute-2 sudo[56420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:29 compute-2 python3.9[56422]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:43:29 compute-2 sudo[56420]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:30 compute-2 sudo[56572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifvvimutaioqedrmnpgkmsvvvcccrrdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582210.2059324-469-61586517975621/AnsiballZ_stat.py'
Dec 01 09:43:30 compute-2 sudo[56572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:30 compute-2 python3.9[56574]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:43:30 compute-2 sudo[56572]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:31 compute-2 sudo[56724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fetajujzdaiqgogvtzkbayakhpuhszpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582211.0690055-499-90295050172756/AnsiballZ_command.py'
Dec 01 09:43:31 compute-2 sudo[56724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:31 compute-2 python3.9[56726]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:43:31 compute-2 sudo[56724]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:32 compute-2 sudo[56877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-movtrdsibiwqvtyyqagefsxczgjmpcwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582211.93162-530-67244460009557/AnsiballZ_service_facts.py'
Dec 01 09:43:32 compute-2 sudo[56877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:32 compute-2 python3.9[56879]: ansible-service_facts Invoked
Dec 01 09:43:32 compute-2 network[56896]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 01 09:43:32 compute-2 network[56897]: 'network-scripts' will be removed from distribution in near future.
Dec 01 09:43:32 compute-2 network[56898]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 01 09:43:35 compute-2 sudo[56877]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:37 compute-2 sudo[57181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksalktqytifmtyacfnkmqqxwjpzkpdab ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1764582217.1907613-575-271570705050387/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1764582217.1907613-575-271570705050387/args'
Dec 01 09:43:37 compute-2 sudo[57181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:37 compute-2 sudo[57181]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:38 compute-2 sudo[57348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbipjlegcfgieizakogzrmdyqefiofhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582218.0060368-608-272960666288239/AnsiballZ_dnf.py'
Dec 01 09:43:38 compute-2 sudo[57348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:38 compute-2 python3.9[57350]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:43:40 compute-2 sudo[57348]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:41 compute-2 sshd-session[57352]: Invalid user developer from 45.78.219.119 port 60696
Dec 01 09:43:41 compute-2 sudo[57503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twahgntgpmyukwapbgcrhkouofbunupg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582221.3385935-647-250564767199078/AnsiballZ_package_facts.py'
Dec 01 09:43:41 compute-2 sudo[57503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:42 compute-2 sshd-session[57352]: Received disconnect from 45.78.219.119 port 60696:11: Bye Bye [preauth]
Dec 01 09:43:42 compute-2 sshd-session[57352]: Disconnected from invalid user developer 45.78.219.119 port 60696 [preauth]
Dec 01 09:43:42 compute-2 python3.9[57505]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec 01 09:43:42 compute-2 sudo[57503]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:43 compute-2 sudo[57655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymhaddkdcdakgjjonwdokgjrnusnwpif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582223.2148726-677-36104949411198/AnsiballZ_stat.py'
Dec 01 09:43:43 compute-2 sudo[57655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:43 compute-2 python3.9[57657]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:43:43 compute-2 sudo[57655]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:44 compute-2 sudo[57780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqfnrklbymzdolpijluuzwnqtnfuqxiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582223.2148726-677-36104949411198/AnsiballZ_copy.py'
Dec 01 09:43:44 compute-2 sudo[57780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:44 compute-2 python3.9[57782]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764582223.2148726-677-36104949411198/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:43:44 compute-2 sudo[57780]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:45 compute-2 sudo[57934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjuoyoipkwlcmdketoowcwfnqbqfafnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582224.7604392-723-179384024438917/AnsiballZ_stat.py'
Dec 01 09:43:45 compute-2 sudo[57934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:45 compute-2 python3.9[57936]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:43:45 compute-2 sudo[57934]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:45 compute-2 sudo[58059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-demnrfkcbhsbinneixngdstgbobfkjme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582224.7604392-723-179384024438917/AnsiballZ_copy.py'
Dec 01 09:43:45 compute-2 sudo[58059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:45 compute-2 python3.9[58061]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764582224.7604392-723-179384024438917/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:43:45 compute-2 sudo[58059]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:47 compute-2 sudo[58213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmekqqsdaaioimthkgzgpbmddmsuydfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582227.0089934-786-136481856463433/AnsiballZ_lineinfile.py'
Dec 01 09:43:47 compute-2 sudo[58213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:47 compute-2 python3.9[58215]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:43:47 compute-2 sudo[58213]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:49 compute-2 sudo[58367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eiuoxuqhjrdgcofnnilzhjxsyvisfuaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582228.804616-830-220839678784681/AnsiballZ_setup.py'
Dec 01 09:43:49 compute-2 sudo[58367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:49 compute-2 python3.9[58369]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:43:49 compute-2 sudo[58367]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:50 compute-2 sudo[58451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhnwjikputgfrmrjnirrhyobpzqjamvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582228.804616-830-220839678784681/AnsiballZ_systemd.py'
Dec 01 09:43:50 compute-2 sudo[58451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:50 compute-2 python3.9[58453]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:43:50 compute-2 sudo[58451]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:51 compute-2 sudo[58605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbayeyldphktuxrzqpusokvnkvyhoerf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582231.6826534-879-275683216807151/AnsiballZ_setup.py'
Dec 01 09:43:51 compute-2 sudo[58605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:52 compute-2 python3.9[58607]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:43:52 compute-2 sudo[58605]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:52 compute-2 sudo[58689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwdanhfwyzhjudjrdwxsiutxrejasshm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582231.6826534-879-275683216807151/AnsiballZ_systemd.py'
Dec 01 09:43:52 compute-2 sudo[58689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:43:53 compute-2 python3.9[58691]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 09:43:53 compute-2 chronyd[788]: chronyd exiting
Dec 01 09:43:53 compute-2 systemd[1]: Stopping NTP client/server...
Dec 01 09:43:53 compute-2 systemd[1]: chronyd.service: Deactivated successfully.
Dec 01 09:43:53 compute-2 systemd[1]: Stopped NTP client/server.
Dec 01 09:43:53 compute-2 systemd[1]: Starting NTP client/server...
Dec 01 09:43:53 compute-2 chronyd[58700]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec 01 09:43:53 compute-2 chronyd[58700]: Frequency -23.796 +/- 0.471 ppm read from /var/lib/chrony/drift
Dec 01 09:43:53 compute-2 chronyd[58700]: Loaded seccomp filter (level 2)
Dec 01 09:43:53 compute-2 systemd[1]: Started NTP client/server.
Dec 01 09:43:53 compute-2 sudo[58689]: pam_unix(sudo:session): session closed for user root
Dec 01 09:43:53 compute-2 sshd-session[53747]: Connection closed by 192.168.122.30 port 36228
Dec 01 09:43:53 compute-2 sshd-session[53744]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:43:53 compute-2 systemd[1]: session-13.scope: Deactivated successfully.
Dec 01 09:43:53 compute-2 systemd[1]: session-13.scope: Consumed 26.524s CPU time.
Dec 01 09:43:53 compute-2 systemd-logind[795]: Session 13 logged out. Waiting for processes to exit.
Dec 01 09:43:53 compute-2 systemd-logind[795]: Removed session 13.
Dec 01 09:44:00 compute-2 sshd-session[58726]: Invalid user rahul from 14.22.89.30 port 47978
Dec 01 09:44:00 compute-2 sshd-session[58726]: Received disconnect from 14.22.89.30 port 47978:11: Bye Bye [preauth]
Dec 01 09:44:00 compute-2 sshd-session[58726]: Disconnected from invalid user rahul 14.22.89.30 port 47978 [preauth]
Dec 01 09:44:04 compute-2 sshd-session[58728]: Accepted publickey for zuul from 192.168.122.30 port 39252 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 09:44:04 compute-2 systemd-logind[795]: New session 14 of user zuul.
Dec 01 09:44:04 compute-2 systemd[1]: Started Session 14 of User zuul.
Dec 01 09:44:04 compute-2 sshd-session[58728]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:44:05 compute-2 sudo[58881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkazvkoldbauvusxwkycnfaarxazfpah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582244.9095902-28-121580536831034/AnsiballZ_file.py'
Dec 01 09:44:05 compute-2 sudo[58881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:05 compute-2 python3.9[58883]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:44:05 compute-2 sudo[58881]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:06 compute-2 sudo[59033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgtbxuhqcimvvyhikvtjwedojwoauchf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582246.1289024-65-7868153935178/AnsiballZ_stat.py'
Dec 01 09:44:06 compute-2 sudo[59033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:06 compute-2 python3.9[59035]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:44:06 compute-2 sudo[59033]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:07 compute-2 sudo[59156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccifydondmmgmrbbsekpmgcgvadvlgmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582246.1289024-65-7868153935178/AnsiballZ_copy.py'
Dec 01 09:44:07 compute-2 sudo[59156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:07 compute-2 python3.9[59158]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764582246.1289024-65-7868153935178/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:44:07 compute-2 sudo[59156]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:07 compute-2 sshd-session[58731]: Connection closed by 192.168.122.30 port 39252
Dec 01 09:44:07 compute-2 sshd-session[58728]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:44:07 compute-2 systemd[1]: session-14.scope: Deactivated successfully.
Dec 01 09:44:07 compute-2 systemd[1]: session-14.scope: Consumed 1.551s CPU time.
Dec 01 09:44:07 compute-2 systemd-logind[795]: Session 14 logged out. Waiting for processes to exit.
Dec 01 09:44:07 compute-2 systemd-logind[795]: Removed session 14.
Dec 01 09:44:08 compute-2 sshd-session[59183]: Invalid user qw from 102.213.183.66 port 37998
Dec 01 09:44:09 compute-2 sshd-session[59183]: Received disconnect from 102.213.183.66 port 37998:11: Bye Bye [preauth]
Dec 01 09:44:09 compute-2 sshd-session[59183]: Disconnected from invalid user qw 102.213.183.66 port 37998 [preauth]
Dec 01 09:44:13 compute-2 sshd-session[59186]: Accepted publickey for zuul from 192.168.122.30 port 33358 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 09:44:13 compute-2 systemd-logind[795]: New session 15 of user zuul.
Dec 01 09:44:13 compute-2 systemd[1]: Started Session 15 of User zuul.
Dec 01 09:44:13 compute-2 sshd-session[59186]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:44:14 compute-2 python3.9[59339]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:44:15 compute-2 sudo[59493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtfmqldktyvkzfwyibvzigxlrvguiwzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582255.3045692-61-274653404431933/AnsiballZ_file.py'
Dec 01 09:44:15 compute-2 sudo[59493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:15 compute-2 python3.9[59495]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:44:15 compute-2 sudo[59493]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:16 compute-2 sudo[59668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pblbogulyooddujuezgvtfrjpdebuzfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582256.218627-85-107696451623044/AnsiballZ_stat.py'
Dec 01 09:44:16 compute-2 sudo[59668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:16 compute-2 python3.9[59670]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:44:16 compute-2 sudo[59668]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:17 compute-2 sudo[59791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bafqjgnnxsyjuiuvmmjllosdztcdxhvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582256.218627-85-107696451623044/AnsiballZ_copy.py'
Dec 01 09:44:17 compute-2 sudo[59791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:17 compute-2 python3.9[59793]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764582256.218627-85-107696451623044/.source.json _original_basename=._ehl4dl5 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:44:17 compute-2 sudo[59791]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:18 compute-2 sudo[59943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbqakttbifutcjlpyiyukdxhmdpoybvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582258.056844-154-123245050801623/AnsiballZ_stat.py'
Dec 01 09:44:18 compute-2 sudo[59943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:18 compute-2 python3.9[59945]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:44:18 compute-2 sudo[59943]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:18 compute-2 sudo[60066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxbqadghvmdxhtclgoyzjnhrzektxqse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582258.056844-154-123245050801623/AnsiballZ_copy.py'
Dec 01 09:44:18 compute-2 sudo[60066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:18 compute-2 python3.9[60068]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764582258.056844-154-123245050801623/.source _original_basename=.58pim4y_ follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:44:19 compute-2 sudo[60066]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:19 compute-2 sudo[60218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrbtpdomaaesvbjslwqmqfyxtjslfqiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582259.4817617-202-45462796612214/AnsiballZ_file.py'
Dec 01 09:44:19 compute-2 sudo[60218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:19 compute-2 python3.9[60220]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:44:19 compute-2 sudo[60218]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:20 compute-2 sudo[60370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eitykkxdupcdkoazsgclltryplawwquu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582260.2239485-226-25038340448245/AnsiballZ_stat.py'
Dec 01 09:44:20 compute-2 sudo[60370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:20 compute-2 python3.9[60372]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:44:20 compute-2 sudo[60370]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:20 compute-2 sudo[60493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wybwxiqtwiufukeqeefrkvabckukczoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582260.2239485-226-25038340448245/AnsiballZ_copy.py'
Dec 01 09:44:20 compute-2 sudo[60493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:21 compute-2 python3.9[60495]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764582260.2239485-226-25038340448245/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:44:21 compute-2 sudo[60493]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:21 compute-2 sudo[60645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-euedcobkwflkhjszgzzpeggxeaxnkuzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582261.2860045-226-43035152132898/AnsiballZ_stat.py'
Dec 01 09:44:21 compute-2 sudo[60645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:21 compute-2 python3.9[60647]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:44:21 compute-2 sudo[60645]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:21 compute-2 sshd-session[59185]: error: kex_exchange_identification: read: Connection timed out
Dec 01 09:44:21 compute-2 sshd-session[59185]: banner exchange: Connection from 222.73.135.87 port 36184: Connection timed out
Dec 01 09:44:22 compute-2 sudo[60768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hygslgmexxeqdsbipacmgjananpbqqcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582261.2860045-226-43035152132898/AnsiballZ_copy.py'
Dec 01 09:44:22 compute-2 sudo[60768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:22 compute-2 python3.9[60770]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764582261.2860045-226-43035152132898/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:44:22 compute-2 sudo[60768]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:23 compute-2 sudo[60920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbshvwtklxykupngyroevxyhjhnclmwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582262.777269-314-198184328799037/AnsiballZ_file.py'
Dec 01 09:44:23 compute-2 sudo[60920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:23 compute-2 python3.9[60922]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:44:23 compute-2 sudo[60920]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:23 compute-2 sudo[61072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcatgmbvtzbubktxffpxbwifovstwdjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582263.4706674-338-231400544038702/AnsiballZ_stat.py'
Dec 01 09:44:23 compute-2 sudo[61072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:23 compute-2 python3.9[61074]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:44:23 compute-2 sudo[61072]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:24 compute-2 sudo[61195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvmluwajpxpektptbycqjrsfqdxbuhjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582263.4706674-338-231400544038702/AnsiballZ_copy.py'
Dec 01 09:44:24 compute-2 sudo[61195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:24 compute-2 python3.9[61197]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582263.4706674-338-231400544038702/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:44:24 compute-2 sudo[61195]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:25 compute-2 sudo[61347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yakwydbjjeaeypigemhtftuvgaonjsbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582265.3237746-383-151428863721463/AnsiballZ_stat.py'
Dec 01 09:44:25 compute-2 sudo[61347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:25 compute-2 python3.9[61349]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:44:25 compute-2 sudo[61347]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:26 compute-2 sudo[61470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhgnvzvnijkdpcnysixydaopmflbwzco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582265.3237746-383-151428863721463/AnsiballZ_copy.py'
Dec 01 09:44:26 compute-2 sudo[61470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:26 compute-2 python3.9[61472]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582265.3237746-383-151428863721463/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:44:26 compute-2 sudo[61470]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:27 compute-2 sudo[61622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dujthwgoewtacdeinbbucwyijmuerptf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582266.664583-427-115404860764177/AnsiballZ_systemd.py'
Dec 01 09:44:27 compute-2 sudo[61622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:27 compute-2 python3.9[61624]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:44:27 compute-2 systemd[1]: Reloading.
Dec 01 09:44:27 compute-2 systemd-rc-local-generator[61652]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:44:27 compute-2 systemd-sysv-generator[61656]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:44:27 compute-2 systemd[1]: Reloading.
Dec 01 09:44:27 compute-2 systemd-sysv-generator[61691]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:44:27 compute-2 systemd-rc-local-generator[61688]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:44:28 compute-2 systemd[1]: Starting EDPM Container Shutdown...
Dec 01 09:44:28 compute-2 systemd[1]: Finished EDPM Container Shutdown.
Dec 01 09:44:28 compute-2 sudo[61622]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:28 compute-2 sudo[61848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grzhznytmlucnxylbqpaficcjknvuulu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582268.3473897-452-13746171790481/AnsiballZ_stat.py'
Dec 01 09:44:28 compute-2 sudo[61848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:28 compute-2 python3.9[61850]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:44:28 compute-2 sudo[61848]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:29 compute-2 sudo[61971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnbsnnisorkwdpxjiqnylysgwzxztgde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582268.3473897-452-13746171790481/AnsiballZ_copy.py'
Dec 01 09:44:29 compute-2 sudo[61971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:29 compute-2 python3.9[61973]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582268.3473897-452-13746171790481/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:44:29 compute-2 sudo[61971]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:29 compute-2 sudo[62123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfccxuzxwdiwtafuizdfkdyptmvjhqhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582269.6281362-497-119021787803109/AnsiballZ_stat.py'
Dec 01 09:44:29 compute-2 sudo[62123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:30 compute-2 python3.9[62125]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:44:30 compute-2 sudo[62123]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:30 compute-2 sudo[62246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjexdwcgxghplzqafpvzliegifgxgwmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582269.6281362-497-119021787803109/AnsiballZ_copy.py'
Dec 01 09:44:30 compute-2 sudo[62246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:30 compute-2 python3.9[62248]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582269.6281362-497-119021787803109/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:44:30 compute-2 sudo[62246]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:31 compute-2 sudo[62398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhseamhodygngvyhcdomnwrsojoyabdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582270.932772-542-88511046118431/AnsiballZ_systemd.py'
Dec 01 09:44:31 compute-2 sudo[62398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:31 compute-2 python3.9[62400]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:44:31 compute-2 systemd[1]: Reloading.
Dec 01 09:44:31 compute-2 systemd-rc-local-generator[62427]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:44:31 compute-2 systemd-sysv-generator[62430]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:44:31 compute-2 systemd[1]: Reloading.
Dec 01 09:44:31 compute-2 systemd-rc-local-generator[62464]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:44:31 compute-2 systemd-sysv-generator[62468]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:44:32 compute-2 systemd[1]: Starting Create netns directory...
Dec 01 09:44:32 compute-2 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 01 09:44:32 compute-2 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 01 09:44:32 compute-2 systemd[1]: Finished Create netns directory.
Dec 01 09:44:32 compute-2 sudo[62398]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:32 compute-2 python3.9[62626]: ansible-ansible.builtin.service_facts Invoked
Dec 01 09:44:32 compute-2 network[62643]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 01 09:44:32 compute-2 network[62644]: 'network-scripts' will be removed from distribution in near future.
Dec 01 09:44:32 compute-2 network[62645]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 01 09:44:37 compute-2 sudo[62905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntfqfntrxvpcmijfomsjaxxshcyderqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582276.8699937-589-256234059498512/AnsiballZ_systemd.py'
Dec 01 09:44:37 compute-2 sudo[62905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:37 compute-2 python3.9[62907]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:44:37 compute-2 systemd[1]: Reloading.
Dec 01 09:44:37 compute-2 systemd-rc-local-generator[62931]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:44:37 compute-2 systemd-sysv-generator[62938]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:44:37 compute-2 systemd[1]: Stopping IPv4 firewall with iptables...
Dec 01 09:44:38 compute-2 iptables.init[62947]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Dec 01 09:44:38 compute-2 iptables.init[62947]: iptables: Flushing firewall rules: [  OK  ]
Dec 01 09:44:38 compute-2 systemd[1]: iptables.service: Deactivated successfully.
Dec 01 09:44:38 compute-2 systemd[1]: Stopped IPv4 firewall with iptables.
Dec 01 09:44:38 compute-2 sudo[62905]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:38 compute-2 sudo[63142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjjhuccnlzranyctzoktjkmdqhuyhqgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582278.3416824-589-33246068296819/AnsiballZ_systemd.py'
Dec 01 09:44:38 compute-2 sudo[63142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:38 compute-2 python3.9[63144]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:44:38 compute-2 sudo[63142]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:39 compute-2 sudo[63296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuvdejrlyqvvdticuxhdtnhkloqnbcgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582279.3281403-637-164560638371295/AnsiballZ_systemd.py'
Dec 01 09:44:39 compute-2 sudo[63296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:39 compute-2 python3.9[63298]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:44:39 compute-2 systemd[1]: Reloading.
Dec 01 09:44:39 compute-2 systemd-rc-local-generator[63327]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:44:39 compute-2 systemd-sysv-generator[63330]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:44:40 compute-2 systemd[1]: Starting Netfilter Tables...
Dec 01 09:44:40 compute-2 systemd[1]: Finished Netfilter Tables.
Dec 01 09:44:40 compute-2 sudo[63296]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:41 compute-2 sudo[63487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghszvjvpmdwkyilgubwgenczzyjcznqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582280.629527-662-2785214366583/AnsiballZ_command.py'
Dec 01 09:44:41 compute-2 sudo[63487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:41 compute-2 python3.9[63489]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:44:41 compute-2 sudo[63487]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:42 compute-2 sudo[63640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgklwetqjblljvjpytdgyjgisnmkemfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582281.8411167-704-106298578853234/AnsiballZ_stat.py'
Dec 01 09:44:42 compute-2 sudo[63640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:42 compute-2 python3.9[63642]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:44:42 compute-2 sudo[63640]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:42 compute-2 sudo[63765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odeezreyuocasdttskjnrmifqtfwsblz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582281.8411167-704-106298578853234/AnsiballZ_copy.py'
Dec 01 09:44:42 compute-2 sudo[63765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:43 compute-2 python3.9[63767]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764582281.8411167-704-106298578853234/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:44:43 compute-2 sudo[63765]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:43 compute-2 sudo[63918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etvdfdetreeuorbzyuojqfmfbuopnpde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582283.5753698-749-210715365964906/AnsiballZ_systemd.py'
Dec 01 09:44:43 compute-2 sudo[63918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:44 compute-2 python3.9[63920]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 09:44:44 compute-2 systemd[1]: Reloading OpenSSH server daemon...
Dec 01 09:44:44 compute-2 sshd[1008]: Received SIGHUP; restarting.
Dec 01 09:44:44 compute-2 systemd[1]: Reloaded OpenSSH server daemon.
Dec 01 09:44:44 compute-2 sshd[1008]: Server listening on 0.0.0.0 port 22.
Dec 01 09:44:44 compute-2 sshd[1008]: Server listening on :: port 22.
Dec 01 09:44:44 compute-2 sudo[63918]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:44 compute-2 sudo[64074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjoniuiqddypfkwdmwlorexzaurpzxon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582284.4961715-773-172688733853917/AnsiballZ_file.py'
Dec 01 09:44:44 compute-2 sudo[64074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:44 compute-2 python3.9[64076]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:44:44 compute-2 sudo[64074]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:45 compute-2 sudo[64226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdkavvuqmdqqhhrfakhppkjygwbeolld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582285.2901804-797-253898812622954/AnsiballZ_stat.py'
Dec 01 09:44:45 compute-2 sudo[64226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:45 compute-2 python3.9[64228]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:44:45 compute-2 sudo[64226]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:46 compute-2 sudo[64349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnnjsplkjylqzpqxjfdnbqeydiktvqrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582285.2901804-797-253898812622954/AnsiballZ_copy.py'
Dec 01 09:44:46 compute-2 sudo[64349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:46 compute-2 python3.9[64351]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582285.2901804-797-253898812622954/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:44:46 compute-2 sudo[64349]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:47 compute-2 sudo[64501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdlxwitlikjvajsfutpaftfhrkwtttsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582287.0275843-851-254070056079367/AnsiballZ_timezone.py'
Dec 01 09:44:47 compute-2 sudo[64501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:47 compute-2 python3.9[64503]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 01 09:44:47 compute-2 systemd[1]: Starting Time & Date Service...
Dec 01 09:44:47 compute-2 systemd[1]: Started Time & Date Service.
Dec 01 09:44:47 compute-2 sudo[64501]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:48 compute-2 sudo[64657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-toewuxdstaqemzmivztarwmbvzxovsvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582288.072704-878-137823587145318/AnsiballZ_file.py'
Dec 01 09:44:48 compute-2 sudo[64657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:48 compute-2 python3.9[64659]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:44:48 compute-2 sudo[64657]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:49 compute-2 sudo[64809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-catwvdoulrpsrngeswxefchzohztomio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582288.8387759-902-100209682202105/AnsiballZ_stat.py'
Dec 01 09:44:49 compute-2 sudo[64809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:49 compute-2 python3.9[64811]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:44:49 compute-2 sudo[64809]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:49 compute-2 sudo[64932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjbbxujyuiuhpfgtjhrszdmqldkiniww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582288.8387759-902-100209682202105/AnsiballZ_copy.py'
Dec 01 09:44:49 compute-2 sudo[64932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:49 compute-2 python3.9[64934]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764582288.8387759-902-100209682202105/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:44:49 compute-2 sudo[64932]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:50 compute-2 sudo[65084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eycwbeiqlkvfdllrdnebdndqgfbzdhix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582290.2396731-947-182848910575357/AnsiballZ_stat.py'
Dec 01 09:44:50 compute-2 sudo[65084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:50 compute-2 python3.9[65086]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:44:50 compute-2 sudo[65084]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:51 compute-2 sudo[65207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzhrpeirezyrvuqgbvnzwhigqmugcacg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582290.2396731-947-182848910575357/AnsiballZ_copy.py'
Dec 01 09:44:51 compute-2 sudo[65207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:51 compute-2 python3.9[65209]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764582290.2396731-947-182848910575357/.source.yaml _original_basename=.7vpwio7n follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:44:51 compute-2 sudo[65207]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:52 compute-2 sudo[65359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urlkqasyudrgbzwzkreghdptowmqdnjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582291.7708483-992-272543069536240/AnsiballZ_stat.py'
Dec 01 09:44:52 compute-2 sudo[65359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:52 compute-2 python3.9[65361]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:44:52 compute-2 sudo[65359]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:52 compute-2 sudo[65482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwbbjlaefswdnoubpnhexrjboyhmuljb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582291.7708483-992-272543069536240/AnsiballZ_copy.py'
Dec 01 09:44:52 compute-2 sudo[65482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:52 compute-2 python3.9[65484]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582291.7708483-992-272543069536240/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:44:52 compute-2 sudo[65482]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:53 compute-2 sudo[65634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqllhvnzfoooqpirgfpgjgifpznohzku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582293.0095153-1037-220361336071733/AnsiballZ_command.py'
Dec 01 09:44:53 compute-2 sudo[65634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:53 compute-2 python3.9[65636]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:44:53 compute-2 sudo[65634]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:54 compute-2 sudo[65787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhbzrhjvaxmtubhvuuiajtsixtzvxien ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582293.757126-1061-174515898798031/AnsiballZ_command.py'
Dec 01 09:44:54 compute-2 sudo[65787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:54 compute-2 python3.9[65789]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:44:54 compute-2 sudo[65787]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:54 compute-2 sudo[65940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwkujdvwmtrmojpifsssfdmswscgdahz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764582294.4615097-1085-192321581026580/AnsiballZ_edpm_nftables_from_files.py'
Dec 01 09:44:54 compute-2 sudo[65940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:55 compute-2 python3[65942]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 01 09:44:55 compute-2 sudo[65940]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:55 compute-2 sudo[66092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfjfmbydvgyzclpjpfazzuiuyqmamfgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582295.3808181-1109-233373051327969/AnsiballZ_stat.py'
Dec 01 09:44:55 compute-2 sudo[66092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:55 compute-2 python3.9[66094]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:44:55 compute-2 sudo[66092]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:56 compute-2 sudo[66215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwjcjmmximhvnbzkrlvspbaehlsuiohh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582295.3808181-1109-233373051327969/AnsiballZ_copy.py'
Dec 01 09:44:56 compute-2 sudo[66215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:56 compute-2 python3.9[66217]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582295.3808181-1109-233373051327969/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:44:56 compute-2 sudo[66215]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:57 compute-2 sudo[66367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlzmileslohqroyjfxdocruuqoyhjhhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582296.8511138-1154-120746740444239/AnsiballZ_stat.py'
Dec 01 09:44:57 compute-2 sudo[66367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:57 compute-2 python3.9[66369]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:44:57 compute-2 sudo[66367]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:57 compute-2 sudo[66490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyckfcimmemiehxgddkqldgaiyjymtpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582296.8511138-1154-120746740444239/AnsiballZ_copy.py'
Dec 01 09:44:57 compute-2 sudo[66490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:57 compute-2 python3.9[66492]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582296.8511138-1154-120746740444239/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:44:57 compute-2 sudo[66490]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:58 compute-2 sudo[66642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zaupkkoplfmuxvxolpsxuvpndfoydtbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582298.1721904-1199-156316518391364/AnsiballZ_stat.py'
Dec 01 09:44:58 compute-2 sudo[66642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:58 compute-2 python3.9[66644]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:44:58 compute-2 sudo[66642]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:58 compute-2 sudo[66765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvoenynuyglitmdxxziuazonxppzindt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582298.1721904-1199-156316518391364/AnsiballZ_copy.py'
Dec 01 09:44:58 compute-2 sudo[66765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:59 compute-2 python3.9[66767]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582298.1721904-1199-156316518391364/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:44:59 compute-2 sudo[66765]: pam_unix(sudo:session): session closed for user root
Dec 01 09:44:59 compute-2 sudo[66917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lchuktyctsadwamytslmyolncwkezube ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582299.4919884-1244-246835608618934/AnsiballZ_stat.py'
Dec 01 09:44:59 compute-2 sudo[66917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:44:59 compute-2 python3.9[66919]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:44:59 compute-2 sudo[66917]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:00 compute-2 sudo[67040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcrvhvgmhxewdekevuhepioofzqgibbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582299.4919884-1244-246835608618934/AnsiballZ_copy.py'
Dec 01 09:45:00 compute-2 sudo[67040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:00 compute-2 python3.9[67042]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582299.4919884-1244-246835608618934/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:45:00 compute-2 sudo[67040]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:01 compute-2 sudo[67192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amysabzxwrcoynpweuzehcwqmtecddgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582300.804938-1289-202780742107880/AnsiballZ_stat.py'
Dec 01 09:45:01 compute-2 sudo[67192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:01 compute-2 python3.9[67194]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:45:01 compute-2 sudo[67192]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:01 compute-2 sudo[67315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cimdwznifjovkdiuhhblxorjplrcyfqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582300.804938-1289-202780742107880/AnsiballZ_copy.py'
Dec 01 09:45:01 compute-2 sudo[67315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:01 compute-2 python3.9[67317]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582300.804938-1289-202780742107880/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:45:01 compute-2 sudo[67315]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:02 compute-2 sudo[67467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqnsyyefonyeuxvdohbpwflracurigyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582302.219669-1334-6697691988722/AnsiballZ_file.py'
Dec 01 09:45:02 compute-2 sudo[67467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:02 compute-2 python3.9[67469]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:45:02 compute-2 sudo[67467]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:03 compute-2 sudo[67619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrqjfrjdcnhurkubclebnkueurejzlgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582302.9098523-1358-195924642061110/AnsiballZ_command.py'
Dec 01 09:45:03 compute-2 sudo[67619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:03 compute-2 python3.9[67621]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:45:03 compute-2 sudo[67619]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:04 compute-2 sudo[67778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igbdbduinrindfbokbrfinkiaaobzcng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582303.648209-1383-74105682538834/AnsiballZ_blockinfile.py'
Dec 01 09:45:04 compute-2 sudo[67778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:04 compute-2 python3.9[67780]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:45:04 compute-2 sudo[67778]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:04 compute-2 sudo[67931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flgkdsejyqyopxgszkswozrkossftaav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582304.7062852-1409-192955239213223/AnsiballZ_file.py'
Dec 01 09:45:04 compute-2 sudo[67931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:05 compute-2 python3.9[67933]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:45:05 compute-2 sudo[67931]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:05 compute-2 sudo[68083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jagbbtjiqdrhcgjxqfstkqgvnhkyqxnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582305.3074243-1409-95674058787391/AnsiballZ_file.py'
Dec 01 09:45:05 compute-2 sudo[68083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:05 compute-2 python3.9[68085]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:45:05 compute-2 sudo[68083]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:06 compute-2 sudo[68235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhxdviftnnpvejfnnbiwdcpxmthzrril ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582306.1002648-1454-128576116587055/AnsiballZ_mount.py'
Dec 01 09:45:06 compute-2 sudo[68235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:06 compute-2 python3.9[68237]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 01 09:45:06 compute-2 sudo[68235]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:07 compute-2 sudo[68388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oouvrgqbxbzfjvvbifgbegrqlldsmzfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582306.8442187-1454-154776069625913/AnsiballZ_mount.py'
Dec 01 09:45:07 compute-2 sudo[68388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:07 compute-2 python3.9[68390]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 01 09:45:07 compute-2 sudo[68388]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:07 compute-2 sshd-session[59189]: Connection closed by 192.168.122.30 port 33358
Dec 01 09:45:07 compute-2 sshd-session[59186]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:45:07 compute-2 systemd[1]: session-15.scope: Deactivated successfully.
Dec 01 09:45:07 compute-2 systemd[1]: session-15.scope: Consumed 32.826s CPU time.
Dec 01 09:45:07 compute-2 systemd-logind[795]: Session 15 logged out. Waiting for processes to exit.
Dec 01 09:45:07 compute-2 systemd-logind[795]: Removed session 15.
Dec 01 09:45:13 compute-2 sshd-session[68416]: Accepted publickey for zuul from 192.168.122.30 port 47336 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 09:45:13 compute-2 systemd-logind[795]: New session 16 of user zuul.
Dec 01 09:45:13 compute-2 systemd[1]: Started Session 16 of User zuul.
Dec 01 09:45:13 compute-2 sshd-session[68416]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:45:13 compute-2 sudo[68569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikfwsjwkigzkmxrntatthmquoomqkadi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582313.1904125-20-137659375539450/AnsiballZ_tempfile.py'
Dec 01 09:45:13 compute-2 sudo[68569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:13 compute-2 python3.9[68571]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec 01 09:45:13 compute-2 sudo[68569]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:14 compute-2 sudo[68721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rggqayfujkowdlcwhqawvbdulggvovro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582314.1469111-56-147909749863776/AnsiballZ_stat.py'
Dec 01 09:45:14 compute-2 sudo[68721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:14 compute-2 python3.9[68723]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:45:14 compute-2 sudo[68721]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:15 compute-2 sudo[68873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcbidxuyolnfsdzygeyiyrlpfuqdjgct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582315.0962696-87-253667144116612/AnsiballZ_setup.py'
Dec 01 09:45:15 compute-2 sudo[68873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:15 compute-2 python3.9[68875]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:45:15 compute-2 sudo[68873]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:16 compute-2 sudo[69025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-feeqnjekkagoyvomtljczghtosrrxdib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582316.2308426-111-123733411819723/AnsiballZ_blockinfile.py'
Dec 01 09:45:16 compute-2 sudo[69025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:16 compute-2 python3.9[69027]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC+tytlc2ziEXCaePFL6NCHfQfG5hnoDOgK+/O6WujzT2GFJESz6sgXypOXA+ry9uSM1AFkZgIIj7YfrFvtxYbWsEyzbhXKiOr8noIZGkfc+43imB+C2FgUp5ZwQSFnnxyIiXQWwKIjrOXbXE1r5SClA+FIAojDoectq/AbKwehIzD1ayHdfehF7BTfXJbkf64RgNcctGyjz0LPxY2mXC0kQXEFZSqJIOn5sys9wQEkjd4XlXA66oaJPV948m4ApJniNd9ohIVmXKAO5Bo6D4WQVvrA03w7PurWjJmpQuKNNwzAn2MMUfwfF0FiH9nxKa5/yEHRA/jTlNtqA/xOFC1uvGvgfWLDMfh+AtXxrNJXtp+qeATiUthHFK9ZRT6xaqkdd+LzySkLVyUCxpvEeOSKcHCqoxNBMZ5p9skmKbus5DRvzBSzPSGfBqh+7efuwSYYRveVZ2iqukef+cMJ5t+mlGuIAZulVVeLXhivpqH20o4d+WgBLNWpPZtP1w3vnds=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKDMbjmqVhbMiFxfeq71aiHzezH5+ve9aaRv6tecZ9yt
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBD2a9/UKab06QjpszdfyP/8+Fmx0ghbxasoTU/24//g4p6oYwAMEXLcqU8YkQj66SK/B/CRmkko20tQpuvcB+LQ=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC9iOYT2GM4L6SHZTMq11oZ+BAk/eXQ8XBJJYa2Eo/9VKQiuDMNzjXWKc1heeqMgloaJAk+En3hPDTZcnt14xKW0weSVhc1GuXBU3IqdQGeO3nyjdhUNxj2O6Syt/8Srh0+ne/yimC9BxBrCHKmwPPCx0TTtiy3n953HP5w0wedM8MI2bl9X4CaVwEtwSUbhFJgRaAVvg1jWUBV+tE9CGQXy1Y7raeATTLvRa3PIqU2pSDvvN44SuFWubkATb9CNZfejG2Tz2N709KveFa1tPaAjiuj046dUN+nb5eMroLvf2T2MoSQ12AUXHcpxVB6qb918qUpn8x9/V65c4fkXQ3nNgbF3IHP7RcwSs0XISdGLMT1NPTmYDhECjFDqTwkiK+goHUXZY3N3dYfjS9uqS1/66OIDlWK6niL0DMO6j+L/iriIIzPVWmrEz384bDc+wVQgGjmVXolCOWq/vp6TE1nAFqsNTZmQXC8BHCGtitnnWgzgbJX3D4O4dBOqHqdPr8=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGEIBRopLb4IdSGL1f5PVbv9932FzGHz/9YCDTQr6PvA
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDEJ0q084PIbFOMDxHa25lnKuVffDClzijZagkDx2W3Z17XxuTVNXMnebqlksv3x5cE8TQLF/PIAPJS87wX+Nuo=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDNxuYL62ECxG4tKU506Q3pIBb6yt0LTfxUgzUGORrXbIq9WrYwVeb+Lkx8v046r7H1KM8BsXHHuc+/3UYA3ldToNXUkjnpV43woAUm6zBViUE4+fgkcOJmVpRTZ/uXPMGTCGECUFZ9zuo3AFkcF0ERCcieOSdVs4uPytJLM0anMY2JZ9BHHzwlK3u+R7I452i/2bTjizB5yGGjV/5usLKdzn3gANHxbNcnVh+sI8fLZDldSAoeh+Lmihzsfp+4optdWgF0GnEgV3ui8NyR+nrPN2A09+4jC0EKzW3P8PT6CaTEgt95tkEYJ0/ihBlX210GmX32GEZfnHIOSflIiIeeAz/8vomjGlRwArfsmlOxT56Q9rekK5hD2orlFCjOvrzfoJN7vvTaE/P8ls/6015TUzbkS2WqhMLJbIvNcumWshvtYifwfnwMI2BK7YTHKpx1Qc/3anJqszHfO0G7ar3+3DemlY50qxApCrKUlE/w1rQtiN1VKmlioP2XpCmwe1s=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKm9ziDthsQekJ2ppuyoRsJLe7WplMYSfdzI6Ftkcb9s
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAnzEG8a/rCCjdE5RU3Uk/1EHo5xwDY20eWwn6aeXJMS7blUnv3gyCa8WoIefjhilEbylrojzG4Tmv2ZgeeLQd4=
                                             create=True mode=0644 path=/tmp/ansible.leixzeb3 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:45:16 compute-2 sudo[69025]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:17 compute-2 sudo[69177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnzuqlklgwgwwmtomwyvnpecryrclwdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582317.102143-135-172137621820500/AnsiballZ_command.py'
Dec 01 09:45:17 compute-2 sudo[69177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:17 compute-2 python3.9[69179]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.leixzeb3' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:45:17 compute-2 sudo[69177]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:17 compute-2 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 01 09:45:18 compute-2 sudo[69333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfwmyoumpupiwfrqwffngtujniwjgsra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582317.9522834-159-206377193983312/AnsiballZ_file.py'
Dec 01 09:45:18 compute-2 sudo[69333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:19 compute-2 python3.9[69335]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.leixzeb3 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:45:19 compute-2 sudo[69333]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:19 compute-2 sshd-session[68419]: Connection closed by 192.168.122.30 port 47336
Dec 01 09:45:19 compute-2 sshd-session[68416]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:45:19 compute-2 systemd[1]: session-16.scope: Deactivated successfully.
Dec 01 09:45:19 compute-2 systemd[1]: session-16.scope: Consumed 3.224s CPU time.
Dec 01 09:45:19 compute-2 systemd-logind[795]: Session 16 logged out. Waiting for processes to exit.
Dec 01 09:45:19 compute-2 systemd-logind[795]: Removed session 16.
Dec 01 09:45:22 compute-2 sshd-session[69361]: Invalid user dev from 102.213.183.66 port 35202
Dec 01 09:45:22 compute-2 sshd-session[69361]: Received disconnect from 102.213.183.66 port 35202:11: Bye Bye [preauth]
Dec 01 09:45:22 compute-2 sshd-session[69361]: Disconnected from invalid user dev 102.213.183.66 port 35202 [preauth]
Dec 01 09:45:25 compute-2 sshd-session[69363]: Accepted publickey for zuul from 192.168.122.30 port 56094 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 09:45:25 compute-2 systemd-logind[795]: New session 17 of user zuul.
Dec 01 09:45:25 compute-2 systemd[1]: Started Session 17 of User zuul.
Dec 01 09:45:25 compute-2 sshd-session[69363]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:45:26 compute-2 python3.9[69516]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:45:27 compute-2 sudo[69670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfdacrsfalycsdrisldhguopjyicfskz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582326.6966789-59-231187207400597/AnsiballZ_systemd.py'
Dec 01 09:45:27 compute-2 sudo[69670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:27 compute-2 python3.9[69672]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 01 09:45:27 compute-2 sudo[69670]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:28 compute-2 sudo[69824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrdclmcsfupmkkiexlrmzxvixljxifee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582328.7815893-82-254400538867375/AnsiballZ_systemd.py'
Dec 01 09:45:28 compute-2 sudo[69824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:29 compute-2 python3.9[69826]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 09:45:29 compute-2 sudo[69824]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:30 compute-2 sudo[69978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kenkyytrgcuzmhpbwxckvupkoojclztc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582329.7641973-110-89746647623969/AnsiballZ_command.py'
Dec 01 09:45:30 compute-2 sudo[69978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:30 compute-2 python3.9[69980]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:45:30 compute-2 sudo[69978]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:31 compute-2 sudo[70131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-autddpxnaxioqpxeoakpaphasvtsixjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582330.6801195-134-268030910592451/AnsiballZ_stat.py'
Dec 01 09:45:31 compute-2 sudo[70131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:31 compute-2 python3.9[70133]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:45:31 compute-2 sudo[70131]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:31 compute-2 sudo[70285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpkfkwzjmmnayjxtbwjnwquohtvuojng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582331.5511825-157-3743801066903/AnsiballZ_command.py'
Dec 01 09:45:31 compute-2 sudo[70285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:31 compute-2 python3.9[70287]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:45:32 compute-2 sudo[70285]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:32 compute-2 sudo[70440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irubcbiodqvsxoaxakytifzryouubiwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582332.2691407-181-145612577415722/AnsiballZ_file.py'
Dec 01 09:45:32 compute-2 sudo[70440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:32 compute-2 python3.9[70442]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:45:32 compute-2 sudo[70440]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:33 compute-2 sshd-session[69366]: Connection closed by 192.168.122.30 port 56094
Dec 01 09:45:33 compute-2 sshd-session[69363]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:45:33 compute-2 systemd[1]: session-17.scope: Deactivated successfully.
Dec 01 09:45:33 compute-2 systemd[1]: session-17.scope: Consumed 4.280s CPU time.
Dec 01 09:45:33 compute-2 systemd-logind[795]: Session 17 logged out. Waiting for processes to exit.
Dec 01 09:45:33 compute-2 systemd-logind[795]: Removed session 17.
Dec 01 09:45:38 compute-2 sshd-session[70467]: Accepted publickey for zuul from 192.168.122.30 port 56748 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 09:45:38 compute-2 systemd-logind[795]: New session 18 of user zuul.
Dec 01 09:45:38 compute-2 systemd[1]: Started Session 18 of User zuul.
Dec 01 09:45:38 compute-2 sshd-session[70467]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:45:39 compute-2 python3.9[70620]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:45:40 compute-2 sudo[70774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tezrmcwbyzniwmxrbguhzvgsninwqmti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582339.9129863-64-221155307406236/AnsiballZ_setup.py'
Dec 01 09:45:40 compute-2 sudo[70774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:40 compute-2 sshd-session[69904]: error: kex_exchange_identification: read: Connection timed out
Dec 01 09:45:40 compute-2 sshd-session[69904]: banner exchange: Connection from 14.22.89.30 port 49110: Connection timed out
Dec 01 09:45:40 compute-2 python3.9[70776]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:45:40 compute-2 sudo[70774]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:41 compute-2 sudo[70858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpvbrxpbpgtpwwpgnhoxexmvjxaekjsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582339.9129863-64-221155307406236/AnsiballZ_dnf.py'
Dec 01 09:45:41 compute-2 sudo[70858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:41 compute-2 python3.9[70860]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 01 09:45:42 compute-2 sudo[70858]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:43 compute-2 python3.9[71011]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:45:44 compute-2 python3.9[71162]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 01 09:45:45 compute-2 python3.9[71312]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:45:45 compute-2 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 01 09:45:45 compute-2 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 01 09:45:46 compute-2 python3.9[71463]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:45:47 compute-2 sshd-session[70470]: Connection closed by 192.168.122.30 port 56748
Dec 01 09:45:47 compute-2 sshd-session[70467]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:45:47 compute-2 systemd[1]: session-18.scope: Deactivated successfully.
Dec 01 09:45:47 compute-2 systemd[1]: session-18.scope: Consumed 5.614s CPU time.
Dec 01 09:45:47 compute-2 systemd-logind[795]: Session 18 logged out. Waiting for processes to exit.
Dec 01 09:45:47 compute-2 systemd-logind[795]: Removed session 18.
Dec 01 09:45:55 compute-2 sshd-session[71488]: Accepted publickey for zuul from 38.102.83.143 port 55982 ssh2: RSA SHA256:A8KzWK46IZ9u9VeBeLMGXVv9yesAJ5sUIau6zdZZ9P8
Dec 01 09:45:55 compute-2 systemd-logind[795]: New session 19 of user zuul.
Dec 01 09:45:55 compute-2 systemd[1]: Started Session 19 of User zuul.
Dec 01 09:45:55 compute-2 sshd-session[71488]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:45:55 compute-2 sudo[71564]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebvnjzpqrypcmsqgovpwvspkqowcsesh ; /usr/bin/python3'
Dec 01 09:45:55 compute-2 sudo[71564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:55 compute-2 useradd[71568]: new group: name=ceph-admin, GID=42478
Dec 01 09:45:55 compute-2 useradd[71568]: new user: name=ceph-admin, UID=42477, GID=42478, home=/home/ceph-admin, shell=/bin/bash, from=none
Dec 01 09:45:55 compute-2 sudo[71564]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:56 compute-2 sudo[71650]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ceserrnxjyrfmoispdfgdbvuljkouklt ; /usr/bin/python3'
Dec 01 09:45:56 compute-2 sudo[71650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:56 compute-2 sudo[71650]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:56 compute-2 sudo[71723]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqchcojcaqsosixoundgyonfszqalqxk ; /usr/bin/python3'
Dec 01 09:45:56 compute-2 sudo[71723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:56 compute-2 sudo[71723]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:57 compute-2 sudo[71773]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjbltsyqgsfrcgrhvbimbrlnyejobpqo ; /usr/bin/python3'
Dec 01 09:45:57 compute-2 sudo[71773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:57 compute-2 sudo[71773]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:58 compute-2 sudo[71799]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnwdjknkbgjgjmfqntgxztiwbkrulpqs ; /usr/bin/python3'
Dec 01 09:45:58 compute-2 sudo[71799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:58 compute-2 sudo[71799]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:58 compute-2 sudo[71825]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpwadpdkdfwajktxmzvpagbqmkhnuqej ; /usr/bin/python3'
Dec 01 09:45:58 compute-2 sudo[71825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:58 compute-2 sudo[71825]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:59 compute-2 sudo[71851]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfbsjcalaamnkgxpykxzpaoqnsmkmnfs ; /usr/bin/python3'
Dec 01 09:45:59 compute-2 sudo[71851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:59 compute-2 sudo[71851]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:59 compute-2 sudo[71929]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuqngyboismoonovslmlzrxwpdywkiar ; /usr/bin/python3'
Dec 01 09:45:59 compute-2 sudo[71929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:59 compute-2 sudo[71929]: pam_unix(sudo:session): session closed for user root
Dec 01 09:45:59 compute-2 sudo[72002]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcarjnahitfwsqethebejxxkhzblqtlr ; /usr/bin/python3'
Dec 01 09:45:59 compute-2 sudo[72002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:45:59 compute-2 sudo[72002]: pam_unix(sudo:session): session closed for user root
Dec 01 09:46:00 compute-2 sudo[72104]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edtknmcwikkzegdcbwscbbsylimiqzzc ; /usr/bin/python3'
Dec 01 09:46:00 compute-2 sudo[72104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:46:00 compute-2 sudo[72104]: pam_unix(sudo:session): session closed for user root
Dec 01 09:46:00 compute-2 sudo[72177]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvzggzgtksdiilwncyvqzgqxyhefnfso ; /usr/bin/python3'
Dec 01 09:46:00 compute-2 sudo[72177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:46:00 compute-2 sudo[72177]: pam_unix(sudo:session): session closed for user root
Dec 01 09:46:01 compute-2 sudo[72227]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twdfiukoqihlxhogzpkvhxmvxcwpyceh ; /usr/bin/python3'
Dec 01 09:46:01 compute-2 sudo[72227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:46:01 compute-2 python3[72229]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:46:02 compute-2 sudo[72227]: pam_unix(sudo:session): session closed for user root
Dec 01 09:46:03 compute-2 sudo[72322]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swzvznbsihaydkorrghlitrtsppyogmo ; /usr/bin/python3'
Dec 01 09:46:03 compute-2 sudo[72322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:46:03 compute-2 chronyd[58700]: Selected source 149.56.19.163 (pool.ntp.org)
Dec 01 09:46:03 compute-2 python3[72324]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 01 09:46:04 compute-2 sudo[72322]: pam_unix(sudo:session): session closed for user root
Dec 01 09:46:05 compute-2 sudo[72349]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orjvvqbdvbxkmzodnwumgabtstcofjyq ; /usr/bin/python3'
Dec 01 09:46:05 compute-2 sudo[72349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:46:05 compute-2 python3[72351]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 01 09:46:05 compute-2 sudo[72349]: pam_unix(sudo:session): session closed for user root
Dec 01 09:46:05 compute-2 sudo[72375]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhopehlzozayhhqrgvickvrkjqvczapn ; /usr/bin/python3'
Dec 01 09:46:05 compute-2 sudo[72375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:46:05 compute-2 python3[72377]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=20G
                                          losetup /dev/loop3 /var/lib/ceph-osd-0.img
                                          lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:46:05 compute-2 kernel: loop: module loaded
Dec 01 09:46:05 compute-2 kernel: loop3: detected capacity change from 0 to 41943040
Dec 01 09:46:05 compute-2 sudo[72375]: pam_unix(sudo:session): session closed for user root
Dec 01 09:46:05 compute-2 sudo[72410]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgifvmzcpdzlekvjokpzpnhtepdzauzc ; /usr/bin/python3'
Dec 01 09:46:05 compute-2 sudo[72410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:46:06 compute-2 python3[72412]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3
                                          vgcreate ceph_vg0 /dev/loop3
                                          lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0
                                          lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:46:06 compute-2 lvm[72415]: PV /dev/loop3 not used.
Dec 01 09:46:06 compute-2 lvm[72424]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 01 09:46:06 compute-2 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Dec 01 09:46:06 compute-2 lvm[72426]:   1 logical volume(s) in volume group "ceph_vg0" now active
Dec 01 09:46:06 compute-2 sudo[72410]: pam_unix(sudo:session): session closed for user root
Dec 01 09:46:06 compute-2 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Dec 01 09:46:06 compute-2 sudo[72502]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjeimkmghomqjaksficljfiuplomtpuq ; /usr/bin/python3'
Dec 01 09:46:06 compute-2 sudo[72502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:46:06 compute-2 python3[72504]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 01 09:46:06 compute-2 sudo[72502]: pam_unix(sudo:session): session closed for user root
Dec 01 09:46:07 compute-2 sudo[72575]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbizvwzuxwevrzvauyyrborqjinjdkmi ; /usr/bin/python3'
Dec 01 09:46:07 compute-2 sudo[72575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:46:07 compute-2 python3[72577]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764582366.6851885-36893-167555823505866/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:46:07 compute-2 sudo[72575]: pam_unix(sudo:session): session closed for user root
Dec 01 09:46:07 compute-2 sudo[72625]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxwvlksmseibiwsbmqvjpfpkvomfbrix ; /usr/bin/python3'
Dec 01 09:46:07 compute-2 sudo[72625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:46:08 compute-2 python3[72627]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:46:08 compute-2 systemd[1]: Reloading.
Dec 01 09:46:08 compute-2 systemd-rc-local-generator[72652]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:46:08 compute-2 systemd-sysv-generator[72658]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:46:08 compute-2 systemd[1]: Starting Ceph OSD losetup...
Dec 01 09:46:08 compute-2 bash[72666]: /dev/loop3: [64513]:4327939 (/var/lib/ceph-osd-0.img)
Dec 01 09:46:08 compute-2 systemd[1]: Finished Ceph OSD losetup.
Dec 01 09:46:08 compute-2 sudo[72625]: pam_unix(sudo:session): session closed for user root
Dec 01 09:46:08 compute-2 lvm[72668]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 01 09:46:08 compute-2 lvm[72668]: VG ceph_vg0 finished
Dec 01 09:46:11 compute-2 python3[72692]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:46:16 compute-2 sshd-session[72736]: Invalid user astra from 45.78.219.119 port 42646
Dec 01 09:46:16 compute-2 sshd-session[72736]: Received disconnect from 45.78.219.119 port 42646:11: Bye Bye [preauth]
Dec 01 09:46:16 compute-2 sshd-session[72736]: Disconnected from invalid user astra 45.78.219.119 port 42646 [preauth]
Dec 01 09:46:34 compute-2 sshd-session[72739]: Received disconnect from 102.213.183.66 port 36282:11: Bye Bye [preauth]
Dec 01 09:46:34 compute-2 sshd-session[72739]: Disconnected from authenticating user root 102.213.183.66 port 36282 [preauth]
Dec 01 09:47:04 compute-2 sshd-session[72741]: Received disconnect from 14.22.89.30 port 44272:11: Bye Bye [preauth]
Dec 01 09:47:04 compute-2 sshd-session[72741]: Disconnected from authenticating user root 14.22.89.30 port 44272 [preauth]
Dec 01 09:47:45 compute-2 sshd-session[72743]: Accepted publickey for ceph-admin from 192.168.122.100 port 56480 ssh2: RSA SHA256:Z7lOpv0ev+jSExhAdyqEmkuT2VPaBkXv4p0gmi28XsM
Dec 01 09:47:45 compute-2 systemd[1]: Created slice User Slice of UID 42477.
Dec 01 09:47:45 compute-2 systemd[1]: Starting User Runtime Directory /run/user/42477...
Dec 01 09:47:45 compute-2 systemd-logind[795]: New session 20 of user ceph-admin.
Dec 01 09:47:45 compute-2 systemd[1]: Finished User Runtime Directory /run/user/42477.
Dec 01 09:47:45 compute-2 systemd[1]: Starting User Manager for UID 42477...
Dec 01 09:47:45 compute-2 systemd[72747]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:47:45 compute-2 systemd[72747]: Queued start job for default target Main User Target.
Dec 01 09:47:45 compute-2 sshd-session[72761]: Accepted publickey for ceph-admin from 192.168.122.100 port 56492 ssh2: RSA SHA256:Z7lOpv0ev+jSExhAdyqEmkuT2VPaBkXv4p0gmi28XsM
Dec 01 09:47:45 compute-2 systemd-logind[795]: New session 22 of user ceph-admin.
Dec 01 09:47:45 compute-2 systemd[72747]: Created slice User Application Slice.
Dec 01 09:47:45 compute-2 systemd[72747]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 01 09:47:45 compute-2 systemd[72747]: Started Daily Cleanup of User's Temporary Directories.
Dec 01 09:47:45 compute-2 systemd[72747]: Reached target Paths.
Dec 01 09:47:45 compute-2 systemd[72747]: Reached target Timers.
Dec 01 09:47:45 compute-2 systemd[72747]: Starting D-Bus User Message Bus Socket...
Dec 01 09:47:45 compute-2 systemd[72747]: Starting Create User's Volatile Files and Directories...
Dec 01 09:47:45 compute-2 systemd[72747]: Finished Create User's Volatile Files and Directories.
Dec 01 09:47:45 compute-2 systemd[72747]: Listening on D-Bus User Message Bus Socket.
Dec 01 09:47:45 compute-2 systemd[72747]: Reached target Sockets.
Dec 01 09:47:45 compute-2 systemd[72747]: Reached target Basic System.
Dec 01 09:47:45 compute-2 systemd[72747]: Reached target Main User Target.
Dec 01 09:47:45 compute-2 systemd[72747]: Startup finished in 128ms.
Dec 01 09:47:45 compute-2 systemd[1]: Started User Manager for UID 42477.
Dec 01 09:47:45 compute-2 systemd[1]: Started Session 20 of User ceph-admin.
Dec 01 09:47:45 compute-2 systemd[1]: Started Session 22 of User ceph-admin.
Dec 01 09:47:45 compute-2 sshd-session[72743]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:47:45 compute-2 sshd-session[72761]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:47:45 compute-2 sudo[72768]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:47:45 compute-2 sudo[72768]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:45 compute-2 sudo[72768]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:46 compute-2 sshd-session[72793]: Accepted publickey for ceph-admin from 192.168.122.100 port 56494 ssh2: RSA SHA256:Z7lOpv0ev+jSExhAdyqEmkuT2VPaBkXv4p0gmi28XsM
Dec 01 09:47:46 compute-2 systemd-logind[795]: New session 23 of user ceph-admin.
Dec 01 09:47:46 compute-2 systemd[1]: Started Session 23 of User ceph-admin.
Dec 01 09:47:46 compute-2 sshd-session[72793]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:47:46 compute-2 sudo[72797]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host --expect-hostname compute-2
Dec 01 09:47:46 compute-2 sudo[72797]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:46 compute-2 sudo[72797]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:46 compute-2 sshd-session[72822]: Accepted publickey for ceph-admin from 192.168.122.100 port 56498 ssh2: RSA SHA256:Z7lOpv0ev+jSExhAdyqEmkuT2VPaBkXv4p0gmi28XsM
Dec 01 09:47:46 compute-2 systemd-logind[795]: New session 24 of user ceph-admin.
Dec 01 09:47:46 compute-2 systemd[1]: Started Session 24 of User ceph-admin.
Dec 01 09:47:46 compute-2 sshd-session[72822]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:47:46 compute-2 sudo[72826]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36
Dec 01 09:47:46 compute-2 sudo[72826]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:46 compute-2 sudo[72826]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:46 compute-2 sshd-session[72851]: Accepted publickey for ceph-admin from 192.168.122.100 port 56512 ssh2: RSA SHA256:Z7lOpv0ev+jSExhAdyqEmkuT2VPaBkXv4p0gmi28XsM
Dec 01 09:47:46 compute-2 systemd-logind[795]: New session 25 of user ceph-admin.
Dec 01 09:47:46 compute-2 systemd[1]: Started Session 25 of User ceph-admin.
Dec 01 09:47:46 compute-2 sshd-session[72851]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:47:46 compute-2 sudo[72855]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:47:46 compute-2 sudo[72855]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:46 compute-2 sudo[72855]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:47 compute-2 sshd-session[72880]: Accepted publickey for ceph-admin from 192.168.122.100 port 56514 ssh2: RSA SHA256:Z7lOpv0ev+jSExhAdyqEmkuT2VPaBkXv4p0gmi28XsM
Dec 01 09:47:47 compute-2 systemd-logind[795]: New session 26 of user ceph-admin.
Dec 01 09:47:47 compute-2 systemd[1]: Started Session 26 of User ceph-admin.
Dec 01 09:47:47 compute-2 sshd-session[72880]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:47:47 compute-2 sudo[72884]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:47:47 compute-2 sudo[72884]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:47 compute-2 sudo[72884]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:47 compute-2 sshd-session[72909]: Accepted publickey for ceph-admin from 192.168.122.100 port 56520 ssh2: RSA SHA256:Z7lOpv0ev+jSExhAdyqEmkuT2VPaBkXv4p0gmi28XsM
Dec 01 09:47:47 compute-2 systemd-logind[795]: New session 27 of user ceph-admin.
Dec 01 09:47:47 compute-2 systemd[1]: Started Session 27 of User ceph-admin.
Dec 01 09:47:47 compute-2 sshd-session[72909]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:47:47 compute-2 sudo[72913]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36.new
Dec 01 09:47:47 compute-2 sudo[72913]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:47 compute-2 sudo[72913]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:47 compute-2 sshd-session[72938]: Accepted publickey for ceph-admin from 192.168.122.100 port 56532 ssh2: RSA SHA256:Z7lOpv0ev+jSExhAdyqEmkuT2VPaBkXv4p0gmi28XsM
Dec 01 09:47:47 compute-2 systemd-logind[795]: New session 28 of user ceph-admin.
Dec 01 09:47:47 compute-2 systemd[1]: Started Session 28 of User ceph-admin.
Dec 01 09:47:47 compute-2 sshd-session[72938]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:47:47 compute-2 sudo[72942]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:47:47 compute-2 sudo[72942]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:47 compute-2 sudo[72942]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:48 compute-2 sshd-session[72967]: Accepted publickey for ceph-admin from 192.168.122.100 port 56538 ssh2: RSA SHA256:Z7lOpv0ev+jSExhAdyqEmkuT2VPaBkXv4p0gmi28XsM
Dec 01 09:47:48 compute-2 systemd-logind[795]: New session 29 of user ceph-admin.
Dec 01 09:47:48 compute-2 systemd[1]: Started Session 29 of User ceph-admin.
Dec 01 09:47:48 compute-2 sshd-session[72967]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:47:48 compute-2 sudo[72971]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36.new
Dec 01 09:47:48 compute-2 sudo[72971]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:48 compute-2 sudo[72971]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:48 compute-2 sshd-session[72998]: Accepted publickey for ceph-admin from 192.168.122.100 port 56546 ssh2: RSA SHA256:Z7lOpv0ev+jSExhAdyqEmkuT2VPaBkXv4p0gmi28XsM
Dec 01 09:47:48 compute-2 systemd-logind[795]: New session 30 of user ceph-admin.
Dec 01 09:47:48 compute-2 systemd[1]: Started Session 30 of User ceph-admin.
Dec 01 09:47:48 compute-2 sshd-session[72998]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:47:49 compute-2 sshd-session[72996]: Received disconnect from 102.213.183.66 port 57694:11: Bye Bye [preauth]
Dec 01 09:47:49 compute-2 sshd-session[72996]: Disconnected from authenticating user root 102.213.183.66 port 57694 [preauth]
Dec 01 09:47:49 compute-2 sshd-session[73025]: Accepted publickey for ceph-admin from 192.168.122.100 port 56558 ssh2: RSA SHA256:Z7lOpv0ev+jSExhAdyqEmkuT2VPaBkXv4p0gmi28XsM
Dec 01 09:47:49 compute-2 systemd-logind[795]: New session 31 of user ceph-admin.
Dec 01 09:47:49 compute-2 systemd[1]: Started Session 31 of User ceph-admin.
Dec 01 09:47:49 compute-2 sshd-session[73025]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:47:49 compute-2 sudo[73029]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36.new /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36
Dec 01 09:47:49 compute-2 sudo[73029]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:49 compute-2 sudo[73029]: pam_unix(sudo:session): session closed for user root
Dec 01 09:47:50 compute-2 sshd-session[73054]: Accepted publickey for ceph-admin from 192.168.122.100 port 56560 ssh2: RSA SHA256:Z7lOpv0ev+jSExhAdyqEmkuT2VPaBkXv4p0gmi28XsM
Dec 01 09:47:50 compute-2 systemd-logind[795]: New session 32 of user ceph-admin.
Dec 01 09:47:50 compute-2 systemd[1]: Started Session 32 of User ceph-admin.
Dec 01 09:47:50 compute-2 sshd-session[73054]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:47:50 compute-2 sudo[73058]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host --expect-hostname compute-2
Dec 01 09:47:50 compute-2 sudo[73058]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:47:50 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 01 09:47:50 compute-2 sudo[73058]: pam_unix(sudo:session): session closed for user root
Dec 01 09:48:27 compute-2 sudo[73102]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:48:27 compute-2 sudo[73102]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:48:27 compute-2 sudo[73102]: pam_unix(sudo:session): session closed for user root
Dec 01 09:48:27 compute-2 sudo[73127]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:48:27 compute-2 sudo[73127]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:48:27 compute-2 sudo[73127]: pam_unix(sudo:session): session closed for user root
Dec 01 09:48:27 compute-2 sudo[73152]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Dec 01 09:48:27 compute-2 sudo[73152]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:48:27 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 01 09:48:27 compute-2 sudo[73152]: pam_unix(sudo:session): session closed for user root
Dec 01 09:48:27 compute-2 sudo[73198]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:48:27 compute-2 sudo[73198]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:48:27 compute-2 sudo[73198]: pam_unix(sudo:session): session closed for user root
Dec 01 09:48:27 compute-2 sudo[73223]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Dec 01 09:48:27 compute-2 sudo[73223]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:48:27 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 01 09:48:28 compute-2 sudo[73223]: pam_unix(sudo:session): session closed for user root
Dec 01 09:48:28 compute-2 sudo[73284]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:48:28 compute-2 sudo[73284]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:48:28 compute-2 sudo[73284]: pam_unix(sudo:session): session closed for user root
Dec 01 09:48:28 compute-2 sudo[73309]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 09:48:28 compute-2 sudo[73309]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:48:28 compute-2 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 73346 (sysctl)
Dec 01 09:48:28 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 01 09:48:28 compute-2 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Dec 01 09:48:28 compute-2 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Dec 01 09:48:28 compute-2 sudo[73309]: pam_unix(sudo:session): session closed for user root
Dec 01 09:48:28 compute-2 sudo[73368]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:48:28 compute-2 sudo[73368]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:48:28 compute-2 sudo[73368]: pam_unix(sudo:session): session closed for user root
Dec 01 09:48:28 compute-2 sudo[73393]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Dec 01 09:48:28 compute-2 sudo[73393]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:48:29 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 01 09:48:29 compute-2 sudo[73393]: pam_unix(sudo:session): session closed for user root
Dec 01 09:48:29 compute-2 sudo[73437]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:48:29 compute-2 sudo[73437]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:48:29 compute-2 sudo[73437]: pam_unix(sudo:session): session closed for user root
Dec 01 09:48:29 compute-2 sudo[73462]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 365f19c2-81e5-5edd-b6b4-280555214d3a -- inventory --format=json-pretty --filter-for-batch
Dec 01 09:48:29 compute-2 sudo[73462]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:48:29 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 01 09:48:29 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 01 09:48:33 compute-2 systemd[1]: var-lib-containers-storage-overlay-compat1753270501-lower\x2dmapped.mount: Deactivated successfully.
Dec 01 09:48:49 compute-2 sshd-session[73583]: Invalid user testuser from 45.78.219.119 port 40822
Dec 01 09:48:50 compute-2 sshd-session[73583]: Received disconnect from 45.78.219.119 port 40822:11: Bye Bye [preauth]
Dec 01 09:48:50 compute-2 sshd-session[73583]: Disconnected from invalid user testuser 45.78.219.119 port 40822 [preauth]
Dec 01 09:49:11 compute-2 podman[73524]: 2025-12-01 09:49:11.15665345 +0000 UTC m=+41.456780570 container create e2cea31b53ce5a5eca58cf71588ae5073c1167e1f51a7161827959fe2906e75d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_almeida, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, OSD_FLAVOR=default)
Dec 01 09:49:11 compute-2 systemd[1]: Created slice Virtual Machine and Container Slice.
Dec 01 09:49:11 compute-2 systemd[1]: Started libpod-conmon-e2cea31b53ce5a5eca58cf71588ae5073c1167e1f51a7161827959fe2906e75d.scope.
Dec 01 09:49:11 compute-2 systemd[1]: Started libcrun container.
Dec 01 09:49:11 compute-2 podman[73524]: 2025-12-01 09:49:11.137930932 +0000 UTC m=+41.438058082 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:49:11 compute-2 podman[73524]: 2025-12-01 09:49:11.24278949 +0000 UTC m=+41.542916630 container init e2cea31b53ce5a5eca58cf71588ae5073c1167e1f51a7161827959fe2906e75d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_almeida, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:49:11 compute-2 podman[73524]: 2025-12-01 09:49:11.249681479 +0000 UTC m=+41.549808599 container start e2cea31b53ce5a5eca58cf71588ae5073c1167e1f51a7161827959fe2906e75d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_almeida, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 01 09:49:11 compute-2 podman[73524]: 2025-12-01 09:49:11.253472202 +0000 UTC m=+41.553599322 container attach e2cea31b53ce5a5eca58cf71588ae5073c1167e1f51a7161827959fe2906e75d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_almeida, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:49:11 compute-2 priceless_almeida[73587]: 167 167
Dec 01 09:49:11 compute-2 systemd[1]: libpod-e2cea31b53ce5a5eca58cf71588ae5073c1167e1f51a7161827959fe2906e75d.scope: Deactivated successfully.
Dec 01 09:49:11 compute-2 conmon[73587]: conmon e2cea31b53ce5a5eca58 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-e2cea31b53ce5a5eca58cf71588ae5073c1167e1f51a7161827959fe2906e75d.scope/container/memory.events
Dec 01 09:49:11 compute-2 podman[73524]: 2025-12-01 09:49:11.25703855 +0000 UTC m=+41.557165690 container died e2cea31b53ce5a5eca58cf71588ae5073c1167e1f51a7161827959fe2906e75d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_almeida, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0)
Dec 01 09:49:11 compute-2 systemd[1]: var-lib-containers-storage-overlay-3ebe246ea77405e4c1a38f949fb4c80ecd6237bd3634525fe3a1b19a5f867c72-merged.mount: Deactivated successfully.
Dec 01 09:49:11 compute-2 podman[73524]: 2025-12-01 09:49:11.297934402 +0000 UTC m=+41.598061522 container remove e2cea31b53ce5a5eca58cf71588ae5073c1167e1f51a7161827959fe2906e75d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_almeida, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Dec 01 09:49:11 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 01 09:49:11 compute-2 systemd[1]: libpod-conmon-e2cea31b53ce5a5eca58cf71588ae5073c1167e1f51a7161827959fe2906e75d.scope: Deactivated successfully.
Dec 01 09:49:11 compute-2 podman[73610]: 2025-12-01 09:49:11.533894114 +0000 UTC m=+0.043397965 container create 134ea2bbd1e7c9feafb96b01ce5ad8eb3e791aba5e214db220277d00d9ddc4be (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_hofstadter, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, ceph=True)
Dec 01 09:49:11 compute-2 systemd[1]: Started libpod-conmon-134ea2bbd1e7c9feafb96b01ce5ad8eb3e791aba5e214db220277d00d9ddc4be.scope.
Dec 01 09:49:11 compute-2 systemd[1]: Started libcrun container.
Dec 01 09:49:11 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3bda1cd74a748120611ced848fb7cba1b4826b0e83ca8bb6d497e30dbf1c6c94/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:11 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3bda1cd74a748120611ced848fb7cba1b4826b0e83ca8bb6d497e30dbf1c6c94/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:11 compute-2 podman[73610]: 2025-12-01 09:49:11.598089107 +0000 UTC m=+0.107592958 container init 134ea2bbd1e7c9feafb96b01ce5ad8eb3e791aba5e214db220277d00d9ddc4be (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_hofstadter, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1)
Dec 01 09:49:11 compute-2 podman[73610]: 2025-12-01 09:49:11.605620762 +0000 UTC m=+0.115124613 container start 134ea2bbd1e7c9feafb96b01ce5ad8eb3e791aba5e214db220277d00d9ddc4be (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_hofstadter, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, ceph=True, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:49:11 compute-2 podman[73610]: 2025-12-01 09:49:11.514190221 +0000 UTC m=+0.023694102 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:49:11 compute-2 podman[73610]: 2025-12-01 09:49:11.609753323 +0000 UTC m=+0.119257204 container attach 134ea2bbd1e7c9feafb96b01ce5ad8eb3e791aba5e214db220277d00d9ddc4be (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_hofstadter, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325)
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]: [
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:     {
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:         "available": false,
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:         "being_replaced": false,
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:         "ceph_device_lvm": false,
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:         "lsm_data": {},
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:         "lvs": [],
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:         "path": "/dev/sr0",
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:         "rejected_reasons": [
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:             "Has a FileSystem",
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:             "Insufficient space (<5GB)"
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:         ],
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:         "sys_api": {
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:             "actuators": null,
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:             "device_nodes": [
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:                 "sr0"
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:             ],
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:             "devname": "sr0",
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:             "human_readable_size": "482.00 KB",
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:             "id_bus": "ata",
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:             "model": "QEMU DVD-ROM",
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:             "nr_requests": "2",
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:             "parent": "/dev/sr0",
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:             "partitions": {},
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:             "path": "/dev/sr0",
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:             "removable": "1",
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:             "rev": "2.5+",
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:             "ro": "0",
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:             "rotational": "1",
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:             "sas_address": "",
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:             "sas_device_handle": "",
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:             "scheduler_mode": "mq-deadline",
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:             "sectors": 0,
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:             "sectorsize": "2048",
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:             "size": 493568.0,
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:             "support_discard": "2048",
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:             "type": "disk",
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:             "vendor": "QEMU"
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:         }
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]:     }
Dec 01 09:49:12 compute-2 relaxed_hofstadter[73627]: ]
Dec 01 09:49:12 compute-2 systemd[1]: libpod-134ea2bbd1e7c9feafb96b01ce5ad8eb3e791aba5e214db220277d00d9ddc4be.scope: Deactivated successfully.
Dec 01 09:49:12 compute-2 podman[74704]: 2025-12-01 09:49:12.407404808 +0000 UTC m=+0.026943681 container died 134ea2bbd1e7c9feafb96b01ce5ad8eb3e791aba5e214db220277d00d9ddc4be (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_hofstadter, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325)
Dec 01 09:49:12 compute-2 systemd[1]: var-lib-containers-storage-overlay-3bda1cd74a748120611ced848fb7cba1b4826b0e83ca8bb6d497e30dbf1c6c94-merged.mount: Deactivated successfully.
Dec 01 09:49:12 compute-2 podman[74704]: 2025-12-01 09:49:12.450032224 +0000 UTC m=+0.069571067 container remove 134ea2bbd1e7c9feafb96b01ce5ad8eb3e791aba5e214db220277d00d9ddc4be (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_hofstadter, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 01 09:49:12 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 01 09:49:12 compute-2 systemd[1]: libpod-conmon-134ea2bbd1e7c9feafb96b01ce5ad8eb3e791aba5e214db220277d00d9ddc4be.scope: Deactivated successfully.
Dec 01 09:49:12 compute-2 sudo[73462]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:12 compute-2 sudo[74719]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 01 09:49:12 compute-2 sudo[74719]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:12 compute-2 sudo[74719]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:12 compute-2 sudo[74744]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph
Dec 01 09:49:12 compute-2 sudo[74744]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:12 compute-2 sudo[74744]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:12 compute-2 sudo[74769]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.conf.new
Dec 01 09:49:12 compute-2 sudo[74769]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:12 compute-2 sudo[74769]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:12 compute-2 sudo[74794]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:49:12 compute-2 sudo[74794]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:12 compute-2 sudo[74794]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:12 compute-2 sudo[74819]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.conf.new
Dec 01 09:49:12 compute-2 sudo[74819]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:12 compute-2 sudo[74819]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:12 compute-2 sudo[74867]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.conf.new
Dec 01 09:49:12 compute-2 sudo[74867]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:12 compute-2 sudo[74867]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:13 compute-2 sudo[74892]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.conf.new
Dec 01 09:49:13 compute-2 sudo[74892]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:13 compute-2 sudo[74892]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:13 compute-2 sudo[74917]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 01 09:49:13 compute-2 sudo[74917]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:13 compute-2 sudo[74917]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:13 compute-2 sudo[74942]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config
Dec 01 09:49:13 compute-2 sudo[74942]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:13 compute-2 sudo[74942]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:13 compute-2 sudo[74967]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config
Dec 01 09:49:13 compute-2 sudo[74967]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:13 compute-2 sudo[74967]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:13 compute-2 sudo[74992]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf.new
Dec 01 09:49:13 compute-2 sudo[74992]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:13 compute-2 sudo[74992]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:13 compute-2 sudo[75017]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:49:13 compute-2 sudo[75017]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:13 compute-2 sudo[75017]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:13 compute-2 sudo[75042]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf.new
Dec 01 09:49:13 compute-2 sudo[75042]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:13 compute-2 sudo[75042]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:13 compute-2 sudo[75090]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf.new
Dec 01 09:49:13 compute-2 sudo[75090]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:13 compute-2 sudo[75090]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:13 compute-2 sudo[75115]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf.new
Dec 01 09:49:13 compute-2 sudo[75115]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:13 compute-2 sudo[75115]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:13 compute-2 sudo[75140]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf.new /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec 01 09:49:13 compute-2 sudo[75140]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:13 compute-2 sudo[75140]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:13 compute-2 sudo[75165]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 01 09:49:13 compute-2 sudo[75165]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:13 compute-2 sudo[75165]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:13 compute-2 sudo[75190]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph
Dec 01 09:49:13 compute-2 sudo[75190]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:13 compute-2 sudo[75190]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:13 compute-2 sudo[75215]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.client.admin.keyring.new
Dec 01 09:49:13 compute-2 sudo[75215]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:13 compute-2 sudo[75215]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:13 compute-2 sudo[75240]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:49:13 compute-2 sudo[75240]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:13 compute-2 sudo[75240]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:13 compute-2 sudo[75265]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.client.admin.keyring.new
Dec 01 09:49:13 compute-2 sudo[75265]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:13 compute-2 sudo[75265]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:14 compute-2 sudo[75313]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.client.admin.keyring.new
Dec 01 09:49:14 compute-2 sudo[75313]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:14 compute-2 sudo[75313]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:14 compute-2 sudo[75338]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.client.admin.keyring.new
Dec 01 09:49:14 compute-2 sudo[75338]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:14 compute-2 sudo[75338]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:14 compute-2 sudo[75363]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 01 09:49:14 compute-2 sudo[75363]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:14 compute-2 sudo[75363]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:14 compute-2 sudo[75388]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config
Dec 01 09:49:14 compute-2 sudo[75388]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:14 compute-2 sudo[75388]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:14 compute-2 sudo[75413]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config
Dec 01 09:49:14 compute-2 sudo[75413]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:14 compute-2 sudo[75413]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:14 compute-2 sudo[75438]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring.new
Dec 01 09:49:14 compute-2 sudo[75438]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:14 compute-2 sudo[75438]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:14 compute-2 sudo[75463]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:49:14 compute-2 sudo[75463]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:14 compute-2 sudo[75463]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:14 compute-2 sudo[75488]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring.new
Dec 01 09:49:14 compute-2 sudo[75488]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:14 compute-2 sudo[75488]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:14 compute-2 sudo[75536]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring.new
Dec 01 09:49:14 compute-2 sudo[75536]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:14 compute-2 sudo[75536]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:14 compute-2 sudo[75561]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring.new
Dec 01 09:49:14 compute-2 sudo[75561]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:14 compute-2 sudo[75561]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:14 compute-2 sudo[75586]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring.new /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec 01 09:49:14 compute-2 sudo[75586]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:14 compute-2 sudo[75586]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:14 compute-2 sudo[75611]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:49:14 compute-2 sudo[75611]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:14 compute-2 sudo[75611]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:14 compute-2 sudo[75636]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:49:14 compute-2 sudo[75636]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:15 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 01 09:49:15 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 01 09:49:15 compute-2 podman[75702]: 2025-12-01 09:49:15.195617962 +0000 UTC m=+0.035684906 container create 4a5904384ad920f38555bd4f8ef28228e33634fea8c67f43be137797a6e1cf47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hopeful_hugle, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2)
Dec 01 09:49:15 compute-2 systemd[1]: Started libpod-conmon-4a5904384ad920f38555bd4f8ef28228e33634fea8c67f43be137797a6e1cf47.scope.
Dec 01 09:49:15 compute-2 systemd[1]: Started libcrun container.
Dec 01 09:49:15 compute-2 podman[75702]: 2025-12-01 09:49:15.265192596 +0000 UTC m=+0.105259540 container init 4a5904384ad920f38555bd4f8ef28228e33634fea8c67f43be137797a6e1cf47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hopeful_hugle, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:49:15 compute-2 podman[75702]: 2025-12-01 09:49:15.271189064 +0000 UTC m=+0.111255998 container start 4a5904384ad920f38555bd4f8ef28228e33634fea8c67f43be137797a6e1cf47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hopeful_hugle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:49:15 compute-2 hopeful_hugle[75718]: 167 167
Dec 01 09:49:15 compute-2 podman[75702]: 2025-12-01 09:49:15.275029157 +0000 UTC m=+0.115096101 container attach 4a5904384ad920f38555bd4f8ef28228e33634fea8c67f43be137797a6e1cf47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hopeful_hugle, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1)
Dec 01 09:49:15 compute-2 systemd[1]: libpod-4a5904384ad920f38555bd4f8ef28228e33634fea8c67f43be137797a6e1cf47.scope: Deactivated successfully.
Dec 01 09:49:15 compute-2 podman[75702]: 2025-12-01 09:49:15.275627673 +0000 UTC m=+0.115694607 container died 4a5904384ad920f38555bd4f8ef28228e33634fea8c67f43be137797a6e1cf47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hopeful_hugle, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec 01 09:49:15 compute-2 podman[75702]: 2025-12-01 09:49:15.180012759 +0000 UTC m=+0.020079723 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:49:15 compute-2 podman[75702]: 2025-12-01 09:49:15.311896351 +0000 UTC m=+0.151963285 container remove 4a5904384ad920f38555bd4f8ef28228e33634fea8c67f43be137797a6e1cf47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hopeful_hugle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1)
Dec 01 09:49:15 compute-2 systemd[1]: libpod-conmon-4a5904384ad920f38555bd4f8ef28228e33634fea8c67f43be137797a6e1cf47.scope: Deactivated successfully.
Dec 01 09:49:15 compute-2 podman[75735]: 2025-12-01 09:49:15.376384191 +0000 UTC m=+0.042583525 container create 53df06f37e11ffaa772bb1060c12f4da4c00b21357964ded2567baf0016023c6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_bell, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec 01 09:49:15 compute-2 systemd[1]: Started libpod-conmon-53df06f37e11ffaa772bb1060c12f4da4c00b21357964ded2567baf0016023c6.scope.
Dec 01 09:49:15 compute-2 systemd[1]: Started libcrun container.
Dec 01 09:49:15 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e2c54bc53b5bad03f96627646d00cd0e89122de97601b3e6b90f77bf25252582/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:15 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e2c54bc53b5bad03f96627646d00cd0e89122de97601b3e6b90f77bf25252582/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:15 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e2c54bc53b5bad03f96627646d00cd0e89122de97601b3e6b90f77bf25252582/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:15 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e2c54bc53b5bad03f96627646d00cd0e89122de97601b3e6b90f77bf25252582/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:15 compute-2 podman[75735]: 2025-12-01 09:49:15.451358858 +0000 UTC m=+0.117558212 container init 53df06f37e11ffaa772bb1060c12f4da4c00b21357964ded2567baf0016023c6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_bell, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1)
Dec 01 09:49:15 compute-2 podman[75735]: 2025-12-01 09:49:15.358981785 +0000 UTC m=+0.025181139 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:49:15 compute-2 podman[75735]: 2025-12-01 09:49:15.457116659 +0000 UTC m=+0.123315983 container start 53df06f37e11ffaa772bb1060c12f4da4c00b21357964ded2567baf0016023c6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_bell, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 01 09:49:15 compute-2 podman[75735]: 2025-12-01 09:49:15.460558794 +0000 UTC m=+0.126758128 container attach 53df06f37e11ffaa772bb1060c12f4da4c00b21357964ded2567baf0016023c6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_bell, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:49:15 compute-2 systemd[1]: libpod-53df06f37e11ffaa772bb1060c12f4da4c00b21357964ded2567baf0016023c6.scope: Deactivated successfully.
Dec 01 09:49:15 compute-2 podman[75735]: 2025-12-01 09:49:15.544832219 +0000 UTC m=+0.211031553 container died 53df06f37e11ffaa772bb1060c12f4da4c00b21357964ded2567baf0016023c6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_bell, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:49:15 compute-2 podman[75735]: 2025-12-01 09:49:15.577771717 +0000 UTC m=+0.243971051 container remove 53df06f37e11ffaa772bb1060c12f4da4c00b21357964ded2567baf0016023c6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_bell, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:49:15 compute-2 systemd[1]: libpod-conmon-53df06f37e11ffaa772bb1060c12f4da4c00b21357964ded2567baf0016023c6.scope: Deactivated successfully.
Dec 01 09:49:15 compute-2 systemd[1]: Reloading.
Dec 01 09:49:15 compute-2 systemd-sysv-generator[75823]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:49:15 compute-2 systemd-rc-local-generator[75820]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:49:15 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 01 09:49:15 compute-2 systemd[1]: Reloading.
Dec 01 09:49:15 compute-2 systemd-rc-local-generator[75857]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:49:15 compute-2 systemd-sysv-generator[75861]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:49:16 compute-2 systemd[1]: Reached target All Ceph clusters and services.
Dec 01 09:49:16 compute-2 systemd[1]: Reloading.
Dec 01 09:49:16 compute-2 systemd-rc-local-generator[75895]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:49:16 compute-2 systemd-sysv-generator[75899]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:49:16 compute-2 systemd[1]: Reached target Ceph cluster 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 09:49:16 compute-2 systemd[1]: Reloading.
Dec 01 09:49:16 compute-2 systemd-sysv-generator[75937]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:49:16 compute-2 systemd-rc-local-generator[75934]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:49:16 compute-2 systemd[1]: Reloading.
Dec 01 09:49:16 compute-2 systemd-rc-local-generator[75974]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:49:16 compute-2 systemd-sysv-generator[75978]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:49:16 compute-2 systemd[1]: Created slice Slice /system/ceph-365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 09:49:16 compute-2 systemd[1]: Reached target System Time Set.
Dec 01 09:49:16 compute-2 systemd[1]: Reached target System Time Synchronized.
Dec 01 09:49:16 compute-2 systemd[1]: Starting Ceph mon.compute-2 for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec 01 09:49:17 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 01 09:49:17 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 01 09:49:17 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 01 09:49:17 compute-2 podman[76033]: 2025-12-01 09:49:17.200177252 +0000 UTC m=+0.040285988 container create 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec 01 09:49:17 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60ed1c9237b7e89573109c2e713fe13da43e9dabfc5e87172f6ad148d906d1a5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:17 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60ed1c9237b7e89573109c2e713fe13da43e9dabfc5e87172f6ad148d906d1a5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:17 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60ed1c9237b7e89573109c2e713fe13da43e9dabfc5e87172f6ad148d906d1a5/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:17 compute-2 podman[76033]: 2025-12-01 09:49:17.251768666 +0000 UTC m=+0.091877422 container init 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:49:17 compute-2 podman[76033]: 2025-12-01 09:49:17.257062766 +0000 UTC m=+0.097171502 container start 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, ceph=True)
Dec 01 09:49:17 compute-2 bash[76033]: 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742
Dec 01 09:49:17 compute-2 podman[76033]: 2025-12-01 09:49:17.181383572 +0000 UTC m=+0.021492328 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:49:17 compute-2 systemd[1]: Started Ceph mon.compute-2 for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 09:49:17 compute-2 ceph-mon[76053]: set uid:gid to 167:167 (ceph:ceph)
Dec 01 09:49:17 compute-2 ceph-mon[76053]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mon, pid 2
Dec 01 09:49:17 compute-2 ceph-mon[76053]: pidfile_write: ignore empty --pid-file
Dec 01 09:49:17 compute-2 ceph-mon[76053]: load: jerasure load: lrc 
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: RocksDB version: 7.9.2
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: Git sha 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: Compile date 2025-07-17 03:12:14
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: DB SUMMARY
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: DB Session ID:  RL9G48B0F9YTXUN1O29Q
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: CURRENT file:  CURRENT
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: IDENTITY file:  IDENTITY
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-2/store.db dir, Total Num: 0, files: 
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-2/store.db: 000004.log size: 511 ; 
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                         Options.error_if_exists: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                       Options.create_if_missing: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                         Options.paranoid_checks: 1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                                     Options.env: 0x555b622eec20
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                                      Options.fs: PosixFileSystem
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                                Options.info_log: 0x555b63145a20
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                Options.max_file_opening_threads: 16
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                              Options.statistics: (nil)
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                               Options.use_fsync: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                       Options.max_log_file_size: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                         Options.allow_fallocate: 1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                        Options.use_direct_reads: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:          Options.create_missing_column_families: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                              Options.db_log_dir: 
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                                 Options.wal_dir: 
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                   Options.advise_random_on_open: 1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                    Options.write_buffer_manager: 0x555b63149900
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                            Options.rate_limiter: (nil)
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                  Options.unordered_write: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                               Options.row_cache: None
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                              Options.wal_filter: None
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:             Options.allow_ingest_behind: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:             Options.two_write_queues: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:             Options.manual_wal_flush: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:             Options.wal_compression: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:             Options.atomic_flush: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                 Options.log_readahead_size: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:             Options.allow_data_in_errors: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:             Options.db_host_id: __hostname__
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:             Options.max_background_jobs: 2
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:             Options.max_background_compactions: -1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:             Options.max_subcompactions: 1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:             Options.max_total_wal_size: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                          Options.max_open_files: -1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                          Options.bytes_per_sync: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:       Options.compaction_readahead_size: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                  Options.max_background_flushes: -1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: Compression algorithms supported:
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:         kZSTD supported: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:         kXpressCompression supported: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:         kBZip2Compression supported: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:         kLZ4Compression supported: 1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:         kZlibCompression supported: 1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:         kLZ4HCCompression supported: 1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:         kSnappyCompression supported: 1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:           Options.merge_operator: 
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:        Options.compaction_filter: None
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555b631456a0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x555b631689b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:        Options.write_buffer_size: 33554432
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:  Options.max_write_buffer_number: 2
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:          Options.compression: NoCompression
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:             Options.num_levels: 7
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 1215fbd3-3ddd-4760-b4ae-013bf2430882
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582557316556, "job": 1, "event": "recovery_started", "wal_files": [4]}
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582557318974, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1648, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 523, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 401, "raw_average_value_size": 80, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582557319113, "job": 1, "event": "recovery_finished"}
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x555b6316ae00
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: DB pointer 0x555b6317a000
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:49:17 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.14 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.14 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555b631689b0#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 1.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2 does not exist in monmap, will attempt to join an existing cluster
Dec 01 09:49:17 compute-2 sudo[75636]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:17 compute-2 ceph-mon[76053]: using public_addr v2:192.168.122.102:0/0 -> [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0]
Dec 01 09:49:17 compute-2 ceph-mon[76053]: starting mon.compute-2 rank -1 at public addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] at bind addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-2 fsid 365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(???) e0 preinit fsid 365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).mds e1 new map
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).mds e1 print_map
                                           e1
                                           btime 2025-12-01T09:46:50:475394+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: -1
                                            
                                           No filesystems configured
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e4 e4: 1 total, 0 up, 1 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e5 e5: 2 total, 0 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e6 e6: 2 total, 0 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e7 e7: 2 total, 0 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e8 e8: 2 total, 0 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e9 e9: 2 total, 2 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e10 e10: 2 total, 2 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e11 e11: 2 total, 2 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e12 e12: 2 total, 2 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e13 e13: 2 total, 2 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e14 e14: 2 total, 2 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e15 e15: 2 total, 2 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e16 e16: 2 total, 2 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e17 e17: 2 total, 2 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e18 e18: 2 total, 2 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e19 e19: 2 total, 2 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e20 e20: 2 total, 2 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e21 e21: 2 total, 2 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e22 e22: 2 total, 2 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e23 e23: 2 total, 2 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e24 e24: 2 total, 2 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e25 e25: 2 total, 2 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e26 e26: 2 total, 2 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e27 e27: 2 total, 2 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e28 e28: 2 total, 2 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e29 e29: 2 total, 2 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e30 e30: 2 total, 2 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e31 e31: 2 total, 2 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e31 crush map has features 3314933000852226048, adjusting msgr requires
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e31 crush map has features 288514051259236352, adjusting msgr requires
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e31 crush map has features 288514051259236352, adjusting msgr requires
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e31 crush map has features 288514051259236352, adjusting msgr requires
Dec 01 09:49:17 compute-2 ceph-mon[76053]: Updating compute-2:/etc/ceph/ceph.conf
Dec 01 09:49:17 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Dec 01 09:49:17 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Dec 01 09:49:17 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3828223939' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Dec 01 09:49:17 compute-2 ceph-mon[76053]: osdmap e28: 2 total, 2 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Dec 01 09:49:17 compute-2 ceph-mon[76053]: Updating compute-2:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec 01 09:49:17 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:49:17 compute-2 ceph-mon[76053]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Dec 01 09:49:17 compute-2 ceph-mon[76053]: pgmap v82: 38 pgs: 31 unknown, 7 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 01 09:49:17 compute-2 ceph-mon[76053]: Updating compute-2:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec 01 09:49:17 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Dec 01 09:49:17 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Dec 01 09:49:17 compute-2 ceph-mon[76053]: osdmap e29: 2 total, 2 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Dec 01 09:49:17 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec 01 09:49:17 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec 01 09:49:17 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec 01 09:49:17 compute-2 ceph-mon[76053]: pgmap v84: 69 pgs: 32 peering, 31 unknown, 6 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 01 09:49:17 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:49:17 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Dec 01 09:49:17 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Dec 01 09:49:17 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:49:17 compute-2 ceph-mon[76053]: Deploying daemon mon.compute-2 on compute-2
Dec 01 09:49:17 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3663653222' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Dec 01 09:49:17 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Dec 01 09:49:17 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Dec 01 09:49:17 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3663653222' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Dec 01 09:49:17 compute-2 ceph-mon[76053]: osdmap e30: 2 total, 2 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]: dispatch
Dec 01 09:49:17 compute-2 ceph-mon[76053]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 2 service(s): mon,mgr)
Dec 01 09:49:17 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/762968888' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Dec 01 09:49:17 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]': finished
Dec 01 09:49:17 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/762968888' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Dec 01 09:49:17 compute-2 ceph-mon[76053]: osdmap e31: 2 total, 2 up, 2 in
Dec 01 09:49:17 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Dec 01 09:49:17 compute-2 ceph-mon[76053]: mon.compute-2@-1(synchronizing).paxosservice(auth 1..7) refresh upgraded, format 0 -> 3
Dec 01 09:49:19 compute-2 ceph-mon[76053]: mon.compute-2@-1(probing) e2  my rank is now 1 (was -1)
Dec 01 09:49:19 compute-2 ceph-mon[76053]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Dec 01 09:49:19 compute-2 ceph-mon[76053]: paxos.1).electionLogic(0) init, first boot, initializing epoch at 1 
Dec 01 09:49:19 compute-2 ceph-mon[76053]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 01 09:49:19 compute-2 ceph-mon[76053]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Dec 01 09:49:19 compute-2 ceph-mon[76053]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Dec 01 09:49:21 compute-2 ceph-mon[76053]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Dec 01 09:49:22 compute-2 ceph-mon[76053]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 01 09:49:22 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Dec 01 09:49:22 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout,16=squid ondisk layout}
Dec 01 09:49:22 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 01 09:49:22 compute-2 ceph-mon[76053]: mgrc update_daemon_metadata mon.compute-2 metadata {addrs=[v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0],arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-2,kernel_description=#1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025,kernel_version=5.14.0-642.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864316,os=Linux}
Dec 01 09:49:22 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e32 e32: 2 total, 2 up, 2 in
Dec 01 09:49:22 compute-2 ceph-mon[76053]: 2.1e scrub starts
Dec 01 09:49:22 compute-2 ceph-mon[76053]: 2.1e scrub ok
Dec 01 09:49:22 compute-2 ceph-mon[76053]: Deploying daemon mon.compute-1 on compute-1
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Dec 01 09:49:22 compute-2 ceph-mon[76053]: mon.compute-0 calling monitor election
Dec 01 09:49:22 compute-2 ceph-mon[76053]: 3.17 scrub starts
Dec 01 09:49:22 compute-2 ceph-mon[76053]: 3.17 scrub ok
Dec 01 09:49:22 compute-2 ceph-mon[76053]: 2.1f deep-scrub starts
Dec 01 09:49:22 compute-2 ceph-mon[76053]: 2.1f deep-scrub ok
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Dec 01 09:49:22 compute-2 ceph-mon[76053]: 3.14 scrub starts
Dec 01 09:49:22 compute-2 ceph-mon[76053]: 3.14 scrub ok
Dec 01 09:49:22 compute-2 ceph-mon[76053]: 2.7 scrub starts
Dec 01 09:49:22 compute-2 ceph-mon[76053]: 2.7 scrub ok
Dec 01 09:49:22 compute-2 ceph-mon[76053]: pgmap v88: 100 pgs: 32 peering, 62 unknown, 6 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Dec 01 09:49:22 compute-2 ceph-mon[76053]: mon.compute-2 calling monitor election
Dec 01 09:49:22 compute-2 ceph-mon[76053]: 3.16 scrub starts
Dec 01 09:49:22 compute-2 ceph-mon[76053]: 3.16 scrub ok
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 01 09:49:22 compute-2 ceph-mon[76053]: 2.1b scrub starts
Dec 01 09:49:22 compute-2 ceph-mon[76053]: 2.1b scrub ok
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Dec 01 09:49:22 compute-2 ceph-mon[76053]: 3.13 scrub starts
Dec 01 09:49:22 compute-2 ceph-mon[76053]: 3.13 scrub ok
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 01 09:49:22 compute-2 ceph-mon[76053]: 2.8 deep-scrub starts
Dec 01 09:49:22 compute-2 ceph-mon[76053]: 2.8 deep-scrub ok
Dec 01 09:49:22 compute-2 ceph-mon[76053]: pgmap v89: 100 pgs: 100 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Dec 01 09:49:22 compute-2 ceph-mon[76053]: 3.11 deep-scrub starts
Dec 01 09:49:22 compute-2 ceph-mon[76053]: 3.11 deep-scrub ok
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 01 09:49:22 compute-2 ceph-mon[76053]: 2.1c scrub starts
Dec 01 09:49:22 compute-2 ceph-mon[76053]: 2.1c scrub ok
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Dec 01 09:49:22 compute-2 ceph-mon[76053]: 3.c scrub starts
Dec 01 09:49:22 compute-2 ceph-mon[76053]: 3.c scrub ok
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 01 09:49:22 compute-2 ceph-mon[76053]: mon.compute-0 is new leader, mons compute-0,compute-2 in quorum (ranks 0,1)
Dec 01 09:49:22 compute-2 ceph-mon[76053]: monmap epoch 2
Dec 01 09:49:22 compute-2 ceph-mon[76053]: fsid 365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:49:22 compute-2 ceph-mon[76053]: last_changed 2025-12-01T09:49:17.408437+0000
Dec 01 09:49:22 compute-2 ceph-mon[76053]: created 2025-12-01T09:46:48.019470+0000
Dec 01 09:49:22 compute-2 ceph-mon[76053]: min_mon_release 19 (squid)
Dec 01 09:49:22 compute-2 ceph-mon[76053]: election_strategy: 1
Dec 01 09:49:22 compute-2 ceph-mon[76053]: 0: [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon.compute-0
Dec 01 09:49:22 compute-2 ceph-mon[76053]: 1: [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon.compute-2
Dec 01 09:49:22 compute-2 ceph-mon[76053]: fsmap 
Dec 01 09:49:22 compute-2 ceph-mon[76053]: osdmap e31: 2 total, 2 up, 2 in
Dec 01 09:49:22 compute-2 ceph-mon[76053]: mgrmap e9: compute-0.fospow(active, since 2m)
Dec 01 09:49:22 compute-2 ceph-mon[76053]: Health detail: HEALTH_WARN 2 pool(s) do not have an application enabled
Dec 01 09:49:22 compute-2 ceph-mon[76053]: [WRN] POOL_APP_NOT_ENABLED: 2 pool(s) do not have an application enabled
Dec 01 09:49:22 compute-2 ceph-mon[76053]:     application not enabled on pool 'cephfs.cephfs.meta'
Dec 01 09:49:22 compute-2 ceph-mon[76053]:     application not enabled on pool 'cephfs.cephfs.data'
Dec 01 09:49:22 compute-2 ceph-mon[76053]:     use 'ceph osd pool application enable <pool-name> <app-name>', where <app-name> is 'cephfs', 'rbd', 'rgw', or freeform for custom applications.
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec 01 09:49:22 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.kdtkls", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec 01 09:49:22 compute-2 sudo[76092]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:49:22 compute-2 sudo[76092]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:22 compute-2 sudo[76092]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:22 compute-2 sudo[76117]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:49:22 compute-2 sudo[76117]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:23 compute-2 podman[76182]: 2025-12-01 09:49:23.278290456 +0000 UTC m=+0.039845042 container create f5a7260a53f526071c51a13d1b7f3eed08ae10d0856fada4d235dcdf3a558e38 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gracious_hofstadter, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid)
Dec 01 09:49:23 compute-2 systemd[1]: Started libpod-conmon-f5a7260a53f526071c51a13d1b7f3eed08ae10d0856fada4d235dcdf3a558e38.scope.
Dec 01 09:49:23 compute-2 podman[76182]: 2025-12-01 09:49:23.260382896 +0000 UTC m=+0.021937512 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:49:23 compute-2 systemd[1]: Started libcrun container.
Dec 01 09:49:23 compute-2 podman[76182]: 2025-12-01 09:49:23.372903448 +0000 UTC m=+0.134458064 container init f5a7260a53f526071c51a13d1b7f3eed08ae10d0856fada4d235dcdf3a558e38 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gracious_hofstadter, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec 01 09:49:23 compute-2 podman[76182]: 2025-12-01 09:49:23.382055683 +0000 UTC m=+0.143610279 container start f5a7260a53f526071c51a13d1b7f3eed08ae10d0856fada4d235dcdf3a558e38 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gracious_hofstadter, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec 01 09:49:23 compute-2 podman[76182]: 2025-12-01 09:49:23.386419902 +0000 UTC m=+0.147974518 container attach f5a7260a53f526071c51a13d1b7f3eed08ae10d0856fada4d235dcdf3a558e38 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gracious_hofstadter, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec 01 09:49:23 compute-2 gracious_hofstadter[76198]: 167 167
Dec 01 09:49:23 compute-2 systemd[1]: libpod-f5a7260a53f526071c51a13d1b7f3eed08ae10d0856fada4d235dcdf3a558e38.scope: Deactivated successfully.
Dec 01 09:49:23 compute-2 podman[76182]: 2025-12-01 09:49:23.390407769 +0000 UTC m=+0.151962355 container died f5a7260a53f526071c51a13d1b7f3eed08ae10d0856fada4d235dcdf3a558e38 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gracious_hofstadter, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:49:23 compute-2 systemd[1]: var-lib-containers-storage-overlay-033c957c09265e4c79373b2f8904389571d5c3c0988f11613c7bb4aad23722b9-merged.mount: Deactivated successfully.
Dec 01 09:49:23 compute-2 podman[76182]: 2025-12-01 09:49:23.445818125 +0000 UTC m=+0.207372711 container remove f5a7260a53f526071c51a13d1b7f3eed08ae10d0856fada4d235dcdf3a558e38 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gracious_hofstadter, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:49:23 compute-2 systemd[1]: libpod-conmon-f5a7260a53f526071c51a13d1b7f3eed08ae10d0856fada4d235dcdf3a558e38.scope: Deactivated successfully.
Dec 01 09:49:23 compute-2 systemd[1]: Reloading.
Dec 01 09:49:23 compute-2 systemd-rc-local-generator[76246]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:49:23 compute-2 systemd-sysv-generator[76250]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:49:23 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Dec 01 09:49:23 compute-2 systemd[1]: Reloading.
Dec 01 09:49:23 compute-2 systemd-rc-local-generator[76286]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:49:23 compute-2 systemd-sysv-generator[76290]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:49:24 compute-2 systemd[1]: Starting Ceph mgr.compute-2.kdtkls for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec 01 09:49:24 compute-2 ceph-mon[76053]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Dec 01 09:49:24 compute-2 ceph-mon[76053]: paxos.1).electionLogic(10) init, last seen epoch 10
Dec 01 09:49:24 compute-2 ceph-mon[76053]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 01 09:49:24 compute-2 podman[76345]: 2025-12-01 09:49:24.272484107 +0000 UTC m=+0.046904087 container create 00006d9f2ff78b962b47f98e04e66b84a45c9513d9f629e604c94995e8ac7670 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Dec 01 09:49:24 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/714896b0d439f924810661332ae9a10b21628a7eda4de7c9b9daed6bb45089c0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:24 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/714896b0d439f924810661332ae9a10b21628a7eda4de7c9b9daed6bb45089c0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:24 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/714896b0d439f924810661332ae9a10b21628a7eda4de7c9b9daed6bb45089c0/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:24 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/714896b0d439f924810661332ae9a10b21628a7eda4de7c9b9daed6bb45089c0/merged/var/lib/ceph/mgr/ceph-compute-2.kdtkls supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:24 compute-2 podman[76345]: 2025-12-01 09:49:24.335993582 +0000 UTC m=+0.110413582 container init 00006d9f2ff78b962b47f98e04e66b84a45c9513d9f629e604c94995e8ac7670 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:49:24 compute-2 podman[76345]: 2025-12-01 09:49:24.341471257 +0000 UTC m=+0.115891237 container start 00006d9f2ff78b962b47f98e04e66b84a45c9513d9f629e604c94995e8ac7670 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1)
Dec 01 09:49:24 compute-2 bash[76345]: 00006d9f2ff78b962b47f98e04e66b84a45c9513d9f629e604c94995e8ac7670
Dec 01 09:49:24 compute-2 podman[76345]: 2025-12-01 09:49:24.25354774 +0000 UTC m=+0.027967740 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:49:24 compute-2 systemd[1]: Started Ceph mgr.compute-2.kdtkls for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 09:49:24 compute-2 ceph-mon[76053]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Dec 01 09:49:24 compute-2 sudo[76117]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:24 compute-2 ceph-mon[76053]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Dec 01 09:49:24 compute-2 ceph-mon[76053]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Dec 01 09:49:25 compute-2 ceph-mon[76053]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Dec 01 09:49:27 compute-2 ceph-mon[76053]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Dec 01 09:49:27 compute-2 ceph-mon[76053]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Dec 01 09:49:27 compute-2 ceph-mon[76053]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Dec 01 09:49:28 compute-2 ceph-mon[76053]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Dec 01 09:49:29 compute-2 ceph-mon[76053]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 01 09:49:29 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 01 09:49:29 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e33 e33: 2 total, 2 up, 2 in
Dec 01 09:49:29 compute-2 ceph-mon[76053]: 3.1f scrub starts
Dec 01 09:49:29 compute-2 ceph-mon[76053]: 3.1f scrub ok
Dec 01 09:49:29 compute-2 ceph-mon[76053]: 2.2 scrub starts
Dec 01 09:49:29 compute-2 ceph-mon[76053]: 2.2 scrub ok
Dec 01 09:49:29 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Dec 01 09:49:29 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 01 09:49:29 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Dec 01 09:49:29 compute-2 ceph-mon[76053]: mon.compute-0 calling monitor election
Dec 01 09:49:29 compute-2 ceph-mon[76053]: mon.compute-2 calling monitor election
Dec 01 09:49:29 compute-2 ceph-mon[76053]: 3.1e scrub starts
Dec 01 09:49:29 compute-2 ceph-mon[76053]: 3.1e scrub ok
Dec 01 09:49:29 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 01 09:49:29 compute-2 ceph-mon[76053]: 2.5 scrub starts
Dec 01 09:49:29 compute-2 ceph-mon[76053]: 2.5 scrub ok
Dec 01 09:49:29 compute-2 ceph-mon[76053]: pgmap v92: 162 pgs: 48 peering, 62 unknown, 52 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 01 09:49:29 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:49:29 compute-2 ceph-mon[76053]: 4.19 deep-scrub starts
Dec 01 09:49:29 compute-2 ceph-mon[76053]: 4.19 deep-scrub ok
Dec 01 09:49:29 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 01 09:49:29 compute-2 ceph-mon[76053]: 2.0 scrub starts
Dec 01 09:49:29 compute-2 ceph-mon[76053]: 2.0 scrub ok
Dec 01 09:49:29 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 01 09:49:29 compute-2 ceph-mon[76053]: 3.1b scrub starts
Dec 01 09:49:29 compute-2 ceph-mon[76053]: 3.1b scrub ok
Dec 01 09:49:29 compute-2 ceph-mon[76053]: 2.3 scrub starts
Dec 01 09:49:29 compute-2 ceph-mon[76053]: 2.3 scrub ok
Dec 01 09:49:29 compute-2 ceph-mon[76053]: pgmap v93: 162 pgs: 48 peering, 62 unknown, 52 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 01 09:49:29 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:49:29 compute-2 ceph-mon[76053]: 4.1c scrub starts
Dec 01 09:49:29 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 01 09:49:29 compute-2 ceph-mon[76053]: 4.1c scrub ok
Dec 01 09:49:29 compute-2 ceph-mon[76053]: 2.b scrub starts
Dec 01 09:49:29 compute-2 ceph-mon[76053]: 2.b scrub ok
Dec 01 09:49:29 compute-2 ceph-mon[76053]: 4.1d scrub starts
Dec 01 09:49:29 compute-2 ceph-mon[76053]: 4.1d scrub ok
Dec 01 09:49:29 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 01 09:49:29 compute-2 ceph-mon[76053]: 2.f scrub starts
Dec 01 09:49:29 compute-2 ceph-mon[76053]: 2.f scrub ok
Dec 01 09:49:29 compute-2 ceph-mon[76053]: pgmap v94: 162 pgs: 48 peering, 62 unknown, 52 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 01 09:49:29 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:49:29 compute-2 ceph-mon[76053]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Dec 01 09:49:29 compute-2 ceph-mon[76053]: 4.f scrub starts
Dec 01 09:49:29 compute-2 ceph-mon[76053]: 4.f scrub ok
Dec 01 09:49:29 compute-2 ceph-mon[76053]: monmap epoch 3
Dec 01 09:49:29 compute-2 ceph-mon[76053]: fsid 365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:49:29 compute-2 ceph-mon[76053]: last_changed 2025-12-01T09:49:23.596118+0000
Dec 01 09:49:29 compute-2 ceph-mon[76053]: created 2025-12-01T09:46:48.019470+0000
Dec 01 09:49:29 compute-2 ceph-mon[76053]: min_mon_release 19 (squid)
Dec 01 09:49:29 compute-2 ceph-mon[76053]: election_strategy: 1
Dec 01 09:49:29 compute-2 ceph-mon[76053]: 0: [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon.compute-0
Dec 01 09:49:29 compute-2 ceph-mon[76053]: 1: [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon.compute-2
Dec 01 09:49:29 compute-2 ceph-mon[76053]: 2: [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon.compute-1
Dec 01 09:49:29 compute-2 ceph-mon[76053]: fsmap 
Dec 01 09:49:29 compute-2 ceph-mon[76053]: osdmap e32: 2 total, 2 up, 2 in
Dec 01 09:49:29 compute-2 ceph-mon[76053]: mgrmap e9: compute-0.fospow(active, since 2m)
Dec 01 09:49:29 compute-2 ceph-mon[76053]: overall HEALTH_OK
Dec 01 09:49:29 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 01 09:49:30 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Dec 01 09:49:30 compute-2 ceph-mgr[76365]: set uid:gid to 167:167 (ceph:ceph)
Dec 01 09:49:30 compute-2 ceph-mgr[76365]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Dec 01 09:49:30 compute-2 ceph-mgr[76365]: pidfile_write: ignore empty --pid-file
Dec 01 09:49:30 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Dec 01 09:49:30 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'alerts'
Dec 01 09:49:30 compute-2 ceph-mgr[76365]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 01 09:49:30 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'balancer'
Dec 01 09:49:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:30.555+0000 7f81c6b06140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 01 09:49:30 compute-2 ceph-mgr[76365]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 01 09:49:30 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'cephadm'
Dec 01 09:49:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:30.669+0000 7f81c6b06140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 01 09:49:30 compute-2 ceph-mon[76053]: mon.compute-1 calling monitor election
Dec 01 09:49:30 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec 01 09:49:30 compute-2 ceph-mon[76053]: 2.11 deep-scrub starts
Dec 01 09:49:30 compute-2 ceph-mon[76053]: 2.11 deep-scrub ok
Dec 01 09:49:30 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Dec 01 09:49:30 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Dec 01 09:49:30 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Dec 01 09:49:30 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec 01 09:49:30 compute-2 ceph-mon[76053]: osdmap e33: 2 total, 2 up, 2 in
Dec 01 09:49:30 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec 01 09:49:30 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec 01 09:49:30 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.ymizfm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec 01 09:49:30 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.ymizfm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Dec 01 09:49:30 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 09:49:30 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:49:30 compute-2 ceph-mon[76053]: Deploying daemon mgr.compute-1.ymizfm on compute-1
Dec 01 09:49:30 compute-2 ceph-mon[76053]: 3.8 scrub starts
Dec 01 09:49:30 compute-2 ceph-mon[76053]: 3.8 scrub ok
Dec 01 09:49:30 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 01 09:49:30 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e34 e34: 2 total, 2 up, 2 in
Dec 01 09:49:31 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'crash'
Dec 01 09:49:31 compute-2 ceph-mgr[76365]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 01 09:49:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:31.644+0000 7f81c6b06140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 01 09:49:31 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'dashboard'
Dec 01 09:49:31 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Dec 01 09:49:31 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Dec 01 09:49:32 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Dec 01 09:49:32 compute-2 ceph-mon[76053]: 2.12 scrub starts
Dec 01 09:49:32 compute-2 ceph-mon[76053]: 2.12 scrub ok
Dec 01 09:49:32 compute-2 ceph-mon[76053]: pgmap v96: 193 pgs: 48 peering, 93 unknown, 52 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 01 09:49:32 compute-2 ceph-mon[76053]: osdmap e34: 2 total, 2 up, 2 in
Dec 01 09:49:32 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1098136028' entity='client.admin' 
Dec 01 09:49:32 compute-2 ceph-mon[76053]: 4.3 deep-scrub starts
Dec 01 09:49:32 compute-2 ceph-mon[76053]: 4.3 deep-scrub ok
Dec 01 09:49:32 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec 01 09:49:32 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec 01 09:49:32 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec 01 09:49:32 compute-2 sudo[76397]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:49:32 compute-2 sudo[76397]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:32 compute-2 sudo[76397]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:32 compute-2 sudo[76422]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:49:32 compute-2 sudo[76422]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:32 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'devicehealth'
Dec 01 09:49:32 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e34 _set_new_cache_sizes cache_size:1019927439 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:49:32 compute-2 ceph-mgr[76365]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 01 09:49:32 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'diskprediction_local'
Dec 01 09:49:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:32.345+0000 7f81c6b06140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 01 09:49:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec 01 09:49:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec 01 09:49:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]:   from numpy import show_config as show_numpy_config
Dec 01 09:49:32 compute-2 ceph-mgr[76365]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 01 09:49:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:32.539+0000 7f81c6b06140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 01 09:49:32 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'influx'
Dec 01 09:49:32 compute-2 ceph-mgr[76365]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 01 09:49:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:32.619+0000 7f81c6b06140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 01 09:49:32 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'insights'
Dec 01 09:49:32 compute-2 podman[76488]: 2025-12-01 09:49:32.627494703 +0000 UTC m=+0.041883883 container create 542975f4e6856c261cd5ed0ed8f294f110c1633935414859f6cb6659bae0dab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=objective_leavitt, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:49:32 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'iostat'
Dec 01 09:49:32 compute-2 podman[76488]: 2025-12-01 09:49:32.609330055 +0000 UTC m=+0.023719295 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:49:32 compute-2 ceph-mgr[76365]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 01 09:49:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:32.769+0000 7f81c6b06140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 01 09:49:32 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'k8sevents'
Dec 01 09:49:33 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'localpool'
Dec 01 09:49:33 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'mds_autoscaler'
Dec 01 09:49:33 compute-2 systemd[1]: Started libpod-conmon-542975f4e6856c261cd5ed0ed8f294f110c1633935414859f6cb6659bae0dab4.scope.
Dec 01 09:49:33 compute-2 systemd[1]: Started libcrun container.
Dec 01 09:49:33 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'mirroring'
Dec 01 09:49:33 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'nfs'
Dec 01 09:49:33 compute-2 ceph-mgr[76365]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 01 09:49:33 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'orchestrator'
Dec 01 09:49:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:33.881+0000 7f81c6b06140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 01 09:49:34 compute-2 ceph-mgr[76365]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 01 09:49:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:34.130+0000 7f81c6b06140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 01 09:49:34 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'osd_perf_query'
Dec 01 09:49:34 compute-2 ceph-mgr[76365]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 01 09:49:34 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'osd_support'
Dec 01 09:49:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:34.214+0000 7f81c6b06140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 01 09:49:34 compute-2 ceph-mgr[76365]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 01 09:49:34 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'pg_autoscaler'
Dec 01 09:49:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:34.287+0000 7f81c6b06140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 01 09:49:34 compute-2 ceph-mgr[76365]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 01 09:49:34 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'progress'
Dec 01 09:49:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:34.376+0000 7f81c6b06140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 01 09:49:34 compute-2 ceph-mgr[76365]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 01 09:49:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:34.461+0000 7f81c6b06140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 01 09:49:34 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'prometheus'
Dec 01 09:49:34 compute-2 ceph-mon[76053]: 2.14 deep-scrub starts
Dec 01 09:49:34 compute-2 ceph-mon[76053]: 2.14 deep-scrub ok
Dec 01 09:49:34 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec 01 09:49:34 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Dec 01 09:49:34 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Dec 01 09:49:34 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:49:34 compute-2 ceph-mon[76053]: Deploying daemon crash.compute-2 on compute-2
Dec 01 09:49:34 compute-2 ceph-mon[76053]: from='client.14244 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:49:34 compute-2 ceph-mon[76053]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Dec 01 09:49:34 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec 01 09:49:34 compute-2 ceph-mon[76053]: Saving service ingress.rgw.default spec with placement count:2
Dec 01 09:49:34 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec 01 09:49:34 compute-2 ceph-mon[76053]: 3.4 scrub starts
Dec 01 09:49:34 compute-2 ceph-mgr[76365]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 01 09:49:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:34.860+0000 7f81c6b06140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 01 09:49:34 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'rbd_support'
Dec 01 09:49:34 compute-2 ceph-mgr[76365]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 01 09:49:34 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'restful'
Dec 01 09:49:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:34.963+0000 7f81c6b06140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 01 09:49:35 compute-2 podman[76488]: 2025-12-01 09:49:35.041733337 +0000 UTC m=+2.456122537 container init 542975f4e6856c261cd5ed0ed8f294f110c1633935414859f6cb6659bae0dab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=objective_leavitt, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:49:35 compute-2 podman[76488]: 2025-12-01 09:49:35.049527749 +0000 UTC m=+2.463916929 container start 542975f4e6856c261cd5ed0ed8f294f110c1633935414859f6cb6659bae0dab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=objective_leavitt, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:49:35 compute-2 podman[76488]: 2025-12-01 09:49:35.0532265 +0000 UTC m=+2.467615710 container attach 542975f4e6856c261cd5ed0ed8f294f110c1633935414859f6cb6659bae0dab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=objective_leavitt, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:49:35 compute-2 objective_leavitt[76505]: 167 167
Dec 01 09:49:35 compute-2 systemd[1]: libpod-542975f4e6856c261cd5ed0ed8f294f110c1633935414859f6cb6659bae0dab4.scope: Deactivated successfully.
Dec 01 09:49:35 compute-2 podman[76488]: 2025-12-01 09:49:35.056199533 +0000 UTC m=+2.470588713 container died 542975f4e6856c261cd5ed0ed8f294f110c1633935414859f6cb6659bae0dab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=objective_leavitt, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:49:35 compute-2 systemd[1]: var-lib-containers-storage-overlay-c7992d7ec5b0f104925cddbe42420a7497a40094cf27ee1d72845875901d1c4a-merged.mount: Deactivated successfully.
Dec 01 09:49:35 compute-2 podman[76488]: 2025-12-01 09:49:35.092699853 +0000 UTC m=+2.507089033 container remove 542975f4e6856c261cd5ed0ed8f294f110c1633935414859f6cb6659bae0dab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=objective_leavitt, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:49:35 compute-2 systemd[1]: libpod-conmon-542975f4e6856c261cd5ed0ed8f294f110c1633935414859f6cb6659bae0dab4.scope: Deactivated successfully.
Dec 01 09:49:35 compute-2 systemd[1]: Reloading.
Dec 01 09:49:35 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'rgw'
Dec 01 09:49:35 compute-2 systemd-sysv-generator[76554]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:49:35 compute-2 systemd-rc-local-generator[76550]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:49:35 compute-2 systemd[1]: Reloading.
Dec 01 09:49:35 compute-2 ceph-mgr[76365]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 01 09:49:35 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'rook'
Dec 01 09:49:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:35.448+0000 7f81c6b06140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 01 09:49:35 compute-2 systemd-rc-local-generator[76590]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:49:35 compute-2 systemd-sysv-generator[76593]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:49:35 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e35 e35: 2 total, 2 up, 2 in
Dec 01 09:49:35 compute-2 ceph-mon[76053]: 3.4 scrub ok
Dec 01 09:49:35 compute-2 ceph-mon[76053]: 2.16 scrub starts
Dec 01 09:49:35 compute-2 ceph-mon[76053]: 2.16 scrub ok
Dec 01 09:49:35 compute-2 ceph-mon[76053]: pgmap v98: 193 pgs: 48 peering, 93 unknown, 52 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 01 09:49:35 compute-2 ceph-mon[76053]: 4.4 scrub starts
Dec 01 09:49:35 compute-2 ceph-mon[76053]: 4.4 scrub ok
Dec 01 09:49:35 compute-2 ceph-mon[76053]: 2.17 scrub starts
Dec 01 09:49:35 compute-2 ceph-mon[76053]: 2.17 scrub ok
Dec 01 09:49:35 compute-2 ceph-mon[76053]: 3.2 scrub starts
Dec 01 09:49:35 compute-2 ceph-mon[76053]: 3.2 scrub ok
Dec 01 09:49:35 compute-2 systemd[1]: Starting Ceph crash.compute-2 for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec 01 09:49:35 compute-2 podman[76650]: 2025-12-01 09:49:35.88875407 +0000 UTC m=+0.036613793 container create 7ddc5516224ad4add016a05830927164b754e43b4853130e356071c5e1ae7291 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-2, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2)
Dec 01 09:49:35 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47118ce79501a34165db6065d567f96bae1715e185d7b61cb13084a549886ebb/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:35 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47118ce79501a34165db6065d567f96bae1715e185d7b61cb13084a549886ebb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:35 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47118ce79501a34165db6065d567f96bae1715e185d7b61cb13084a549886ebb/merged/etc/ceph/ceph.client.crash.compute-2.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:35 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47118ce79501a34165db6065d567f96bae1715e185d7b61cb13084a549886ebb/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:35 compute-2 podman[76650]: 2025-12-01 09:49:35.950263716 +0000 UTC m=+0.098123469 container init 7ddc5516224ad4add016a05830927164b754e43b4853130e356071c5e1ae7291 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 01 09:49:35 compute-2 podman[76650]: 2025-12-01 09:49:35.954960922 +0000 UTC m=+0.102820645 container start 7ddc5516224ad4add016a05830927164b754e43b4853130e356071c5e1ae7291 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec 01 09:49:35 compute-2 bash[76650]: 7ddc5516224ad4add016a05830927164b754e43b4853130e356071c5e1ae7291
Dec 01 09:49:35 compute-2 podman[76650]: 2025-12-01 09:49:35.872308485 +0000 UTC m=+0.020168228 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:49:35 compute-2 systemd[1]: Started Ceph crash.compute-2 for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 09:49:36 compute-2 sudo[76422]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-2[76666]: INFO:ceph-crash:pinging cluster to exercise our key
Dec 01 09:49:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:36.099+0000 7f81c6b06140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 01 09:49:36 compute-2 ceph-mgr[76365]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 01 09:49:36 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'selftest'
Dec 01 09:49:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-2[76666]: 2025-12-01T09:49:36.122+0000 7f56a5321640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Dec 01 09:49:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-2[76666]: 2025-12-01T09:49:36.122+0000 7f56a5321640 -1 AuthRegistry(0x7f56a00696b0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Dec 01 09:49:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-2[76666]: 2025-12-01T09:49:36.123+0000 7f56a5321640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Dec 01 09:49:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-2[76666]: 2025-12-01T09:49:36.123+0000 7f56a5321640 -1 AuthRegistry(0x7f56a531fff0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Dec 01 09:49:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-2[76666]: 2025-12-01T09:49:36.125+0000 7f569effd640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Dec 01 09:49:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-2[76666]: 2025-12-01T09:49:36.125+0000 7f569f7fe640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Dec 01 09:49:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-2[76666]: 2025-12-01T09:49:36.130+0000 7f569e7fc640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Dec 01 09:49:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-2[76666]: 2025-12-01T09:49:36.130+0000 7f56a5321640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Dec 01 09:49:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-2[76666]: [errno 13] RADOS permission denied (error connecting to the cluster)
Dec 01 09:49:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-2[76666]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Dec 01 09:49:36 compute-2 sudo[76673]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:49:36 compute-2 sudo[76673]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:36 compute-2 sudo[76673]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:36 compute-2 ceph-mgr[76365]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 01 09:49:36 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'snap_schedule'
Dec 01 09:49:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:36.196+0000 7f81c6b06140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 01 09:49:36 compute-2 sudo[76708]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 365f19c2-81e5-5edd-b6b4-280555214d3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 --yes --no-systemd
Dec 01 09:49:36 compute-2 sudo[76708]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:36 compute-2 ceph-mgr[76365]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec 01 09:49:36 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'stats'
Dec 01 09:49:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:36.287+0000 7f81c6b06140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec 01 09:49:36 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'status'
Dec 01 09:49:36 compute-2 ceph-mgr[76365]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec 01 09:49:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:36.456+0000 7f81c6b06140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec 01 09:49:36 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'telegraf'
Dec 01 09:49:36 compute-2 ceph-mgr[76365]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 01 09:49:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:36.537+0000 7f81c6b06140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 01 09:49:36 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'telemetry'
Dec 01 09:49:36 compute-2 podman[76773]: 2025-12-01 09:49:36.546844998 +0000 UTC m=+0.022879124 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:49:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:36.706+0000 7f81c6b06140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 01 09:49:36 compute-2 ceph-mgr[76365]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 01 09:49:36 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'test_orchestrator'
Dec 01 09:49:36 compute-2 podman[76773]: 2025-12-01 09:49:36.838638209 +0000 UTC m=+0.314672315 container create 2bcfedcb27bec656bea10155bee07de6f33df7a5c541f252f0c9d0a845c07a47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=dreamy_bassi, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:49:36 compute-2 ceph-mgr[76365]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 01 09:49:36 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'volumes'
Dec 01 09:49:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:36.949+0000 7f81c6b06140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 01 09:49:37 compute-2 systemd[1]: Started libpod-conmon-2bcfedcb27bec656bea10155bee07de6f33df7a5c541f252f0c9d0a845c07a47.scope.
Dec 01 09:49:37 compute-2 systemd[1]: Started libcrun container.
Dec 01 09:49:37 compute-2 ceph-mgr[76365]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 01 09:49:37 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'zabbix'
Dec 01 09:49:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:37.256+0000 7f81c6b06140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 01 09:49:37 compute-2 ceph-mgr[76365]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 01 09:49:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:37.332+0000 7f81c6b06140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 01 09:49:37 compute-2 ceph-mgr[76365]: ms_deliver_dispatch: unhandled message 0x55ac487f2d00 mon_map magic: 0 from mon.0 v2:192.168.122.100:3300/0
Dec 01 09:49:37 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e36 e36: 2 total, 2 up, 2 in
Dec 01 09:49:37 compute-2 podman[76773]: 2025-12-01 09:49:37.822119065 +0000 UTC m=+1.298153181 container init 2bcfedcb27bec656bea10155bee07de6f33df7a5c541f252f0c9d0a845c07a47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=dreamy_bassi, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, ceph=True)
Dec 01 09:49:37 compute-2 ceph-mon[76053]: from='client.14250 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:49:37 compute-2 ceph-mon[76053]: Saving service node-exporter spec with placement *
Dec 01 09:49:37 compute-2 ceph-mon[76053]: 2.18 scrub starts
Dec 01 09:49:37 compute-2 ceph-mon[76053]: 2.18 scrub ok
Dec 01 09:49:37 compute-2 ceph-mon[76053]: pgmap v99: 193 pgs: 1 active+clean+scrubbing, 192 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Dec 01 09:49:37 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 01 09:49:37 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 01 09:49:37 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 01 09:49:37 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 01 09:49:37 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 01 09:49:37 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 01 09:49:37 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec 01 09:49:37 compute-2 ceph-mon[76053]: Saving service grafana spec with placement compute-0;count:1
Dec 01 09:49:37 compute-2 ceph-mon[76053]: osdmap e35: 2 total, 2 up, 2 in
Dec 01 09:49:37 compute-2 ceph-mon[76053]: 4.6 scrub starts
Dec 01 09:49:37 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec 01 09:49:37 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec 01 09:49:37 compute-2 ceph-mon[76053]: Saving service prometheus spec with placement compute-0;count:1
Dec 01 09:49:37 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec 01 09:49:37 compute-2 ceph-mon[76053]: Saving service alertmanager spec with placement compute-0;count:1
Dec 01 09:49:37 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec 01 09:49:37 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec 01 09:49:37 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec 01 09:49:37 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec 01 09:49:37 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec 01 09:49:37 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:49:37 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:49:37 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:49:37 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:49:37 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:49:37 compute-2 podman[76773]: 2025-12-01 09:49:37.829808874 +0000 UTC m=+1.305842970 container start 2bcfedcb27bec656bea10155bee07de6f33df7a5c541f252f0c9d0a845c07a47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=dreamy_bassi, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 01 09:49:37 compute-2 dreamy_bassi[76789]: 167 167
Dec 01 09:49:37 compute-2 podman[76773]: 2025-12-01 09:49:37.836327385 +0000 UTC m=+1.312361521 container attach 2bcfedcb27bec656bea10155bee07de6f33df7a5c541f252f0c9d0a845c07a47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=dreamy_bassi, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:49:37 compute-2 systemd[1]: libpod-2bcfedcb27bec656bea10155bee07de6f33df7a5c541f252f0c9d0a845c07a47.scope: Deactivated successfully.
Dec 01 09:49:37 compute-2 podman[76773]: 2025-12-01 09:49:37.83732375 +0000 UTC m=+1.313357866 container died 2bcfedcb27bec656bea10155bee07de6f33df7a5c541f252f0c9d0a845c07a47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=dreamy_bassi, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0)
Dec 01 09:49:37 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e36 _set_new_cache_sizes cache_size:1020053086 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:49:37 compute-2 systemd[1]: var-lib-containers-storage-overlay-6dbb5ca4719d9ee24895d37c526425a765e8cbbe61ccbba172ec57d7bcccd824-merged.mount: Deactivated successfully.
Dec 01 09:49:37 compute-2 podman[76773]: 2025-12-01 09:49:37.879846538 +0000 UTC m=+1.355880634 container remove 2bcfedcb27bec656bea10155bee07de6f33df7a5c541f252f0c9d0a845c07a47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=dreamy_bassi, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:49:37 compute-2 systemd[1]: libpod-conmon-2bcfedcb27bec656bea10155bee07de6f33df7a5c541f252f0c9d0a845c07a47.scope: Deactivated successfully.
Dec 01 09:49:38 compute-2 podman[76813]: 2025-12-01 09:49:38.034170061 +0000 UTC m=+0.045049091 container create 742d01aef5528d7dad0331eea628bc704208996df4f04d1084b96f26d8c97ab3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_fermi, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 01 09:49:38 compute-2 podman[76813]: 2025-12-01 09:49:38.014083196 +0000 UTC m=+0.024962256 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:49:38 compute-2 ceph-mon[76053]: 2.1a scrub starts
Dec 01 09:49:38 compute-2 ceph-mon[76053]: 2.1a scrub ok
Dec 01 09:49:38 compute-2 ceph-mon[76053]: 4.6 scrub ok
Dec 01 09:49:38 compute-2 ceph-mon[76053]: 3.1 scrub starts
Dec 01 09:49:38 compute-2 ceph-mon[76053]: 4.1f scrub starts
Dec 01 09:49:38 compute-2 ceph-mon[76053]: 3.1 scrub ok
Dec 01 09:49:38 compute-2 ceph-mon[76053]: 4.1f scrub ok
Dec 01 09:49:38 compute-2 ceph-mon[76053]: pgmap v101: 193 pgs: 1 active+clean+scrubbing, 192 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Dec 01 09:49:38 compute-2 ceph-mon[76053]: osdmap e36: 2 total, 2 up, 2 in
Dec 01 09:49:38 compute-2 ceph-mon[76053]: Standby manager daemon compute-2.kdtkls started
Dec 01 09:49:38 compute-2 ceph-mon[76053]: 3.10 scrub starts
Dec 01 09:49:38 compute-2 ceph-mon[76053]: 3.10 scrub ok
Dec 01 09:49:38 compute-2 ceph-mon[76053]: 4.2 scrub starts
Dec 01 09:49:38 compute-2 ceph-mon[76053]: 4.2 scrub ok
Dec 01 09:49:38 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3669410899' entity='client.admin' 
Dec 01 09:49:38 compute-2 systemd[1]: Started libpod-conmon-742d01aef5528d7dad0331eea628bc704208996df4f04d1084b96f26d8c97ab3.scope.
Dec 01 09:49:38 compute-2 systemd[1]: Started libcrun container.
Dec 01 09:49:38 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c014b6efd124db2addefda5536cabfb78b692bcbaa93bf558946ac3ca073969e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:38 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c014b6efd124db2addefda5536cabfb78b692bcbaa93bf558946ac3ca073969e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:38 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c014b6efd124db2addefda5536cabfb78b692bcbaa93bf558946ac3ca073969e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:38 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c014b6efd124db2addefda5536cabfb78b692bcbaa93bf558946ac3ca073969e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:38 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c014b6efd124db2addefda5536cabfb78b692bcbaa93bf558946ac3ca073969e/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:38 compute-2 podman[76813]: 2025-12-01 09:49:38.544222741 +0000 UTC m=+0.555101781 container init 742d01aef5528d7dad0331eea628bc704208996df4f04d1084b96f26d8c97ab3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_fermi, ceph=True, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:49:38 compute-2 podman[76813]: 2025-12-01 09:49:38.553924479 +0000 UTC m=+0.564803509 container start 742d01aef5528d7dad0331eea628bc704208996df4f04d1084b96f26d8c97ab3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_fermi, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec 01 09:49:38 compute-2 podman[76813]: 2025-12-01 09:49:38.558311528 +0000 UTC m=+0.569190558 container attach 742d01aef5528d7dad0331eea628bc704208996df4f04d1084b96f26d8c97ab3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_fermi, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 01 09:49:38 compute-2 pensive_fermi[76829]: --> passed data devices: 0 physical, 1 LVM
Dec 01 09:49:38 compute-2 pensive_fermi[76829]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 01 09:49:39 compute-2 pensive_fermi[76829]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 01 09:49:39 compute-2 pensive_fermi[76829]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 0eea832e-1517-4443-89c1-2611993976f8
Dec 01 09:49:39 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd new", "uuid": "0eea832e-1517-4443-89c1-2611993976f8"} v 0)
Dec 01 09:49:39 compute-2 ceph-mon[76053]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/1836222916' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "0eea832e-1517-4443-89c1-2611993976f8"}]: dispatch
Dec 01 09:49:41 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e37 e37: 3 total, 2 up, 3 in
Dec 01 09:49:41 compute-2 ceph-mon[76053]: 4.15 scrub starts
Dec 01 09:49:41 compute-2 ceph-mon[76053]: 4.15 scrub ok
Dec 01 09:49:41 compute-2 ceph-mon[76053]: Standby manager daemon compute-1.ymizfm started
Dec 01 09:49:41 compute-2 ceph-mon[76053]: mgrmap e10: compute-0.fospow(active, since 2m), standbys: compute-2.kdtkls
Dec 01 09:49:41 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mgr metadata", "who": "compute-2.kdtkls", "id": "compute-2.kdtkls"}]: dispatch
Dec 01 09:49:41 compute-2 ceph-mon[76053]: 3.6 scrub starts
Dec 01 09:49:41 compute-2 ceph-mon[76053]: 3.6 scrub ok
Dec 01 09:49:41 compute-2 ceph-mon[76053]: pgmap v103: 193 pgs: 1 active+clean+scrubbing, 192 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Dec 01 09:49:41 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/819597' entity='client.admin' 
Dec 01 09:49:41 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1836222916' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "0eea832e-1517-4443-89c1-2611993976f8"}]: dispatch
Dec 01 09:49:41 compute-2 ceph-mon[76053]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "0eea832e-1517-4443-89c1-2611993976f8"}]: dispatch
Dec 01 09:49:41 compute-2 pensive_fermi[76829]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-2
Dec 01 09:49:41 compute-2 pensive_fermi[76829]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Dec 01 09:49:41 compute-2 pensive_fermi[76829]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec 01 09:49:41 compute-2 pensive_fermi[76829]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Dec 01 09:49:41 compute-2 lvm[76891]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 01 09:49:41 compute-2 lvm[76891]: VG ceph_vg0 finished
Dec 01 09:49:41 compute-2 pensive_fermi[76829]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-2/activate.monmap
Dec 01 09:49:42 compute-2 ceph-mon[76053]: 3.15 scrub starts
Dec 01 09:49:42 compute-2 ceph-mon[76053]: 3.15 scrub ok
Dec 01 09:49:42 compute-2 ceph-mon[76053]: 3.7 scrub starts
Dec 01 09:49:42 compute-2 ceph-mon[76053]: 3.7 scrub ok
Dec 01 09:49:42 compute-2 ceph-mon[76053]: 4.9 scrub starts
Dec 01 09:49:42 compute-2 ceph-mon[76053]: 4.9 scrub ok
Dec 01 09:49:42 compute-2 ceph-mon[76053]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "0eea832e-1517-4443-89c1-2611993976f8"}]': finished
Dec 01 09:49:42 compute-2 ceph-mon[76053]: osdmap e37: 3 total, 2 up, 3 in
Dec 01 09:49:42 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:49:42 compute-2 ceph-mon[76053]: 4.0 scrub starts
Dec 01 09:49:42 compute-2 ceph-mon[76053]: 4.0 scrub ok
Dec 01 09:49:42 compute-2 ceph-mon[76053]: pgmap v105: 193 pgs: 193 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Dec 01 09:49:42 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/88022779' entity='client.admin' 
Dec 01 09:49:42 compute-2 ceph-mon[76053]: mgrmap e11: compute-0.fospow(active, since 2m), standbys: compute-2.kdtkls, compute-1.ymizfm
Dec 01 09:49:42 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mgr metadata", "who": "compute-1.ymizfm", "id": "compute-1.ymizfm"}]: dispatch
Dec 01 09:49:42 compute-2 ceph-mon[76053]: 3.e deep-scrub starts
Dec 01 09:49:42 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec 01 09:49:42 compute-2 ceph-mon[76053]: 3.e deep-scrub ok
Dec 01 09:49:42 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec 01 09:49:42 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec 01 09:49:42 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon getmap"} v 0)
Dec 01 09:49:42 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2854973715' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Dec 01 09:49:42 compute-2 pensive_fermi[76829]:  stderr: got monmap epoch 3
Dec 01 09:49:42 compute-2 pensive_fermi[76829]: --> Creating keyring file for osd.2
Dec 01 09:49:42 compute-2 pensive_fermi[76829]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/keyring
Dec 01 09:49:42 compute-2 pensive_fermi[76829]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/
Dec 01 09:49:42 compute-2 pensive_fermi[76829]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 2 --monmap /var/lib/ceph/osd/ceph-2/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-2/ --osd-uuid 0eea832e-1517-4443-89c1-2611993976f8 --setuser ceph --setgroup ceph
Dec 01 09:49:42 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e37 _set_new_cache_sizes cache_size:1020054709 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:49:43 compute-2 ceph-mon[76053]: 4.7 scrub starts
Dec 01 09:49:43 compute-2 ceph-mon[76053]: 4.7 scrub ok
Dec 01 09:49:43 compute-2 ceph-mon[76053]: 3.f scrub starts
Dec 01 09:49:43 compute-2 ceph-mon[76053]: 3.f scrub ok
Dec 01 09:49:43 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2854973715' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Dec 01 09:49:43 compute-2 ceph-mon[76053]: pgmap v106: 193 pgs: 193 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Dec 01 09:49:43 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/4060395120' entity='client.admin' 
Dec 01 09:49:44 compute-2 ceph-mon[76053]: 3.0 scrub starts
Dec 01 09:49:44 compute-2 ceph-mon[76053]: 3.0 scrub ok
Dec 01 09:49:44 compute-2 ceph-mon[76053]: 4.a deep-scrub starts
Dec 01 09:49:44 compute-2 ceph-mon[76053]: 4.a deep-scrub ok
Dec 01 09:49:45 compute-2 ceph-mon[76053]: 3.b scrub starts
Dec 01 09:49:45 compute-2 ceph-mon[76053]: 3.b scrub ok
Dec 01 09:49:45 compute-2 ceph-mon[76053]: 3.d scrub starts
Dec 01 09:49:45 compute-2 ceph-mon[76053]: 3.d scrub ok
Dec 01 09:49:45 compute-2 ceph-mon[76053]: pgmap v107: 193 pgs: 193 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Dec 01 09:49:45 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1919605233' entity='client.admin' 
Dec 01 09:49:45 compute-2 sudo[77355]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmtauqbnyufolawothiylypfjsdujxyr ; /usr/bin/python3'
Dec 01 09:49:45 compute-2 sudo[77355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:49:45 compute-2 python3[77357]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a -f 'name=ceph-?(.*)-mgr.*' --format \{\{\.Command\}\} --no-trunc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:49:46 compute-2 sudo[77355]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:46 compute-2 systemd[72747]: Starting Mark boot as successful...
Dec 01 09:49:46 compute-2 systemd[72747]: Finished Mark boot as successful.
Dec 01 09:49:46 compute-2 ceph-mon[76053]: 4.b scrub starts
Dec 01 09:49:46 compute-2 ceph-mon[76053]: 4.b scrub ok
Dec 01 09:49:46 compute-2 ceph-mon[76053]: 4.8 scrub starts
Dec 01 09:49:46 compute-2 ceph-mon[76053]: 4.8 scrub ok
Dec 01 09:49:47 compute-2 pensive_fermi[76829]:  stderr: 2025-12-01T09:49:42.804+0000 7fe0a35ab740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) No valid bdev label found
Dec 01 09:49:47 compute-2 pensive_fermi[76829]:  stderr: 2025-12-01T09:49:43.066+0000 7fe0a35ab740 -1 bluestore(/var/lib/ceph/osd/ceph-2/) _read_fsid unparsable uuid
Dec 01 09:49:47 compute-2 pensive_fermi[76829]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Dec 01 09:49:47 compute-2 pensive_fermi[76829]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec 01 09:49:47 compute-2 pensive_fermi[76829]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Dec 01 09:49:47 compute-2 ceph-mon[76053]: 4.17 scrub starts
Dec 01 09:49:47 compute-2 ceph-mon[76053]: 4.17 scrub ok
Dec 01 09:49:47 compute-2 ceph-mon[76053]: 3.a scrub starts
Dec 01 09:49:47 compute-2 ceph-mon[76053]: 3.a scrub ok
Dec 01 09:49:47 compute-2 ceph-mon[76053]: 4.16 scrub starts
Dec 01 09:49:47 compute-2 ceph-mon[76053]: 4.16 scrub ok
Dec 01 09:49:47 compute-2 ceph-mon[76053]: pgmap v108: 193 pgs: 193 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Dec 01 09:49:47 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3251606112' entity='client.admin' 
Dec 01 09:49:47 compute-2 pensive_fermi[76829]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Dec 01 09:49:47 compute-2 pensive_fermi[76829]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Dec 01 09:49:47 compute-2 pensive_fermi[76829]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec 01 09:49:47 compute-2 pensive_fermi[76829]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec 01 09:49:47 compute-2 pensive_fermi[76829]: --> ceph-volume lvm activate successful for osd ID: 2
Dec 01 09:49:47 compute-2 pensive_fermi[76829]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Dec 01 09:49:47 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e37 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:49:47 compute-2 systemd[1]: libpod-742d01aef5528d7dad0331eea628bc704208996df4f04d1084b96f26d8c97ab3.scope: Deactivated successfully.
Dec 01 09:49:47 compute-2 systemd[1]: libpod-742d01aef5528d7dad0331eea628bc704208996df4f04d1084b96f26d8c97ab3.scope: Consumed 4.026s CPU time.
Dec 01 09:49:47 compute-2 podman[76813]: 2025-12-01 09:49:47.873167665 +0000 UTC m=+9.884046715 container died 742d01aef5528d7dad0331eea628bc704208996df4f04d1084b96f26d8c97ab3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_fermi, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:49:47 compute-2 systemd[1]: var-lib-containers-storage-overlay-c014b6efd124db2addefda5536cabfb78b692bcbaa93bf558946ac3ca073969e-merged.mount: Deactivated successfully.
Dec 01 09:49:47 compute-2 podman[76813]: 2025-12-01 09:49:47.924210313 +0000 UTC m=+9.935089343 container remove 742d01aef5528d7dad0331eea628bc704208996df4f04d1084b96f26d8c97ab3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_fermi, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec 01 09:49:47 compute-2 systemd[1]: libpod-conmon-742d01aef5528d7dad0331eea628bc704208996df4f04d1084b96f26d8c97ab3.scope: Deactivated successfully.
Dec 01 09:49:47 compute-2 sudo[76708]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:48 compute-2 sudo[77860]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:49:48 compute-2 sudo[77860]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:48 compute-2 sudo[77860]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:48 compute-2 sudo[77885]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 365f19c2-81e5-5edd-b6b4-280555214d3a -- lvm list --format json
Dec 01 09:49:48 compute-2 sudo[77885]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:48 compute-2 podman[77950]: 2025-12-01 09:49:48.447286153 +0000 UTC m=+0.021509781 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:49:49 compute-2 podman[77950]: 2025-12-01 09:49:49.946212672 +0000 UTC m=+1.520436290 container create 13d64a52e99acfacfba872ffab65458d3aaed8cd4557edb0035fb25494383d40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=bold_franklin, ceph=True, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1)
Dec 01 09:49:49 compute-2 ceph-mon[76053]: 3.12 scrub starts
Dec 01 09:49:49 compute-2 ceph-mon[76053]: 3.12 scrub ok
Dec 01 09:49:49 compute-2 ceph-mon[76053]: 4.13 scrub starts
Dec 01 09:49:49 compute-2 ceph-mon[76053]: 4.13 scrub ok
Dec 01 09:49:49 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2873131532' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Dec 01 09:49:49 compute-2 ceph-mon[76053]: 4.14 deep-scrub starts
Dec 01 09:49:50 compute-2 systemd[1]: Started libpod-conmon-13d64a52e99acfacfba872ffab65458d3aaed8cd4557edb0035fb25494383d40.scope.
Dec 01 09:49:50 compute-2 systemd[1]: Started libcrun container.
Dec 01 09:49:50 compute-2 podman[77950]: 2025-12-01 09:49:50.052404899 +0000 UTC m=+1.626628507 container init 13d64a52e99acfacfba872ffab65458d3aaed8cd4557edb0035fb25494383d40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=bold_franklin, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1)
Dec 01 09:49:50 compute-2 podman[77950]: 2025-12-01 09:49:50.059559305 +0000 UTC m=+1.633782903 container start 13d64a52e99acfacfba872ffab65458d3aaed8cd4557edb0035fb25494383d40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=bold_franklin, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True)
Dec 01 09:49:50 compute-2 bold_franklin[77967]: 167 167
Dec 01 09:49:50 compute-2 systemd[1]: libpod-13d64a52e99acfacfba872ffab65458d3aaed8cd4557edb0035fb25494383d40.scope: Deactivated successfully.
Dec 01 09:49:50 compute-2 podman[77950]: 2025-12-01 09:49:50.067929561 +0000 UTC m=+1.642153179 container attach 13d64a52e99acfacfba872ffab65458d3aaed8cd4557edb0035fb25494383d40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=bold_franklin, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:49:50 compute-2 podman[77950]: 2025-12-01 09:49:50.068271719 +0000 UTC m=+1.642495317 container died 13d64a52e99acfacfba872ffab65458d3aaed8cd4557edb0035fb25494383d40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=bold_franklin, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:49:50 compute-2 systemd[1]: var-lib-containers-storage-overlay-3a2ee4b10b2f4642a83e4894c4bcb42e99d9ee0b04c393f254ca8c56bd06694d-merged.mount: Deactivated successfully.
Dec 01 09:49:50 compute-2 podman[77950]: 2025-12-01 09:49:50.11941551 +0000 UTC m=+1.693639108 container remove 13d64a52e99acfacfba872ffab65458d3aaed8cd4557edb0035fb25494383d40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=bold_franklin, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1)
Dec 01 09:49:50 compute-2 systemd[1]: libpod-conmon-13d64a52e99acfacfba872ffab65458d3aaed8cd4557edb0035fb25494383d40.scope: Deactivated successfully.
Dec 01 09:49:50 compute-2 podman[77989]: 2025-12-01 09:49:50.264104715 +0000 UTC m=+0.041778200 container create a33940335f2a2d56531894e85fcfbf6ec42f6696af29375bedb9d97740759ef1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=quirky_cray, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325)
Dec 01 09:49:50 compute-2 systemd[1]: Started libpod-conmon-a33940335f2a2d56531894e85fcfbf6ec42f6696af29375bedb9d97740759ef1.scope.
Dec 01 09:49:50 compute-2 systemd[1]: Started libcrun container.
Dec 01 09:49:50 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e38a45916da3360ce7fe8c7129ef29057039328b35c8b964222cfdff63cd382f/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:50 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e38a45916da3360ce7fe8c7129ef29057039328b35c8b964222cfdff63cd382f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:50 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e38a45916da3360ce7fe8c7129ef29057039328b35c8b964222cfdff63cd382f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:50 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e38a45916da3360ce7fe8c7129ef29057039328b35c8b964222cfdff63cd382f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:50 compute-2 podman[77989]: 2025-12-01 09:49:50.244507412 +0000 UTC m=+0.022180917 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:49:50 compute-2 podman[77989]: 2025-12-01 09:49:50.356082722 +0000 UTC m=+0.133756207 container init a33940335f2a2d56531894e85fcfbf6ec42f6696af29375bedb9d97740759ef1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=quirky_cray, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 01 09:49:50 compute-2 podman[77989]: 2025-12-01 09:49:50.362334646 +0000 UTC m=+0.140008121 container start a33940335f2a2d56531894e85fcfbf6ec42f6696af29375bedb9d97740759ef1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=quirky_cray, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec 01 09:49:50 compute-2 podman[77989]: 2025-12-01 09:49:50.372667181 +0000 UTC m=+0.150340676 container attach a33940335f2a2d56531894e85fcfbf6ec42f6696af29375bedb9d97740759ef1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=quirky_cray, org.label-schema.build-date=20250325, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:49:50 compute-2 quirky_cray[78006]: {
Dec 01 09:49:50 compute-2 quirky_cray[78006]:     "2": [
Dec 01 09:49:50 compute-2 quirky_cray[78006]:         {
Dec 01 09:49:50 compute-2 quirky_cray[78006]:             "devices": [
Dec 01 09:49:50 compute-2 quirky_cray[78006]:                 "/dev/loop3"
Dec 01 09:49:50 compute-2 quirky_cray[78006]:             ],
Dec 01 09:49:50 compute-2 quirky_cray[78006]:             "lv_name": "ceph_lv0",
Dec 01 09:49:50 compute-2 quirky_cray[78006]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:49:50 compute-2 quirky_cray[78006]:             "lv_size": "21470642176",
Dec 01 09:49:50 compute-2 quirky_cray[78006]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=flc7v0-5qhx-Xdzz-ZCA5-6uui-rVGJ-jP3FVa,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=365f19c2-81e5-5edd-b6b4-280555214d3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=0eea832e-1517-4443-89c1-2611993976f8,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 01 09:49:50 compute-2 quirky_cray[78006]:             "lv_uuid": "flc7v0-5qhx-Xdzz-ZCA5-6uui-rVGJ-jP3FVa",
Dec 01 09:49:50 compute-2 quirky_cray[78006]:             "name": "ceph_lv0",
Dec 01 09:49:50 compute-2 quirky_cray[78006]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:49:50 compute-2 quirky_cray[78006]:             "tags": {
Dec 01 09:49:50 compute-2 quirky_cray[78006]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:49:50 compute-2 quirky_cray[78006]:                 "ceph.block_uuid": "flc7v0-5qhx-Xdzz-ZCA5-6uui-rVGJ-jP3FVa",
Dec 01 09:49:50 compute-2 quirky_cray[78006]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:49:50 compute-2 quirky_cray[78006]:                 "ceph.cluster_fsid": "365f19c2-81e5-5edd-b6b4-280555214d3a",
Dec 01 09:49:50 compute-2 quirky_cray[78006]:                 "ceph.cluster_name": "ceph",
Dec 01 09:49:50 compute-2 quirky_cray[78006]:                 "ceph.crush_device_class": "",
Dec 01 09:49:50 compute-2 quirky_cray[78006]:                 "ceph.encrypted": "0",
Dec 01 09:49:50 compute-2 quirky_cray[78006]:                 "ceph.osd_fsid": "0eea832e-1517-4443-89c1-2611993976f8",
Dec 01 09:49:50 compute-2 quirky_cray[78006]:                 "ceph.osd_id": "2",
Dec 01 09:49:50 compute-2 quirky_cray[78006]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:49:50 compute-2 quirky_cray[78006]:                 "ceph.type": "block",
Dec 01 09:49:50 compute-2 quirky_cray[78006]:                 "ceph.vdo": "0",
Dec 01 09:49:50 compute-2 quirky_cray[78006]:                 "ceph.with_tpm": "0"
Dec 01 09:49:50 compute-2 quirky_cray[78006]:             },
Dec 01 09:49:50 compute-2 quirky_cray[78006]:             "type": "block",
Dec 01 09:49:50 compute-2 quirky_cray[78006]:             "vg_name": "ceph_vg0"
Dec 01 09:49:50 compute-2 quirky_cray[78006]:         }
Dec 01 09:49:50 compute-2 quirky_cray[78006]:     ]
Dec 01 09:49:50 compute-2 quirky_cray[78006]: }
Dec 01 09:49:50 compute-2 systemd[1]: libpod-a33940335f2a2d56531894e85fcfbf6ec42f6696af29375bedb9d97740759ef1.scope: Deactivated successfully.
Dec 01 09:49:50 compute-2 podman[77989]: 2025-12-01 09:49:50.652968478 +0000 UTC m=+0.430641953 container died a33940335f2a2d56531894e85fcfbf6ec42f6696af29375bedb9d97740759ef1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=quirky_cray, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325)
Dec 01 09:49:50 compute-2 systemd[1]: var-lib-containers-storage-overlay-e38a45916da3360ce7fe8c7129ef29057039328b35c8b964222cfdff63cd382f-merged.mount: Deactivated successfully.
Dec 01 09:49:50 compute-2 podman[77989]: 2025-12-01 09:49:50.693693882 +0000 UTC m=+0.471367357 container remove a33940335f2a2d56531894e85fcfbf6ec42f6696af29375bedb9d97740759ef1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=quirky_cray, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:49:50 compute-2 systemd[1]: libpod-conmon-a33940335f2a2d56531894e85fcfbf6ec42f6696af29375bedb9d97740759ef1.scope: Deactivated successfully.
Dec 01 09:49:50 compute-2 sudo[77885]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:50 compute-2 sudo[78027]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:49:50 compute-2 sudo[78027]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:50 compute-2 sudo[78027]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:50 compute-2 sudo[78052]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:49:50 compute-2 sudo[78052]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:50 compute-2 ceph-mon[76053]: 4.14 deep-scrub ok
Dec 01 09:49:50 compute-2 ceph-mon[76053]: 4.d scrub starts
Dec 01 09:49:50 compute-2 ceph-mon[76053]: 4.d scrub ok
Dec 01 09:49:50 compute-2 ceph-mon[76053]: pgmap v109: 193 pgs: 193 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Dec 01 09:49:50 compute-2 ceph-mon[76053]: 4.12 scrub starts
Dec 01 09:49:50 compute-2 ceph-mon[76053]: 4.12 scrub ok
Dec 01 09:49:50 compute-2 ceph-mon[76053]: 3.3 scrub starts
Dec 01 09:49:50 compute-2 ceph-mon[76053]: 3.3 scrub ok
Dec 01 09:49:50 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2873131532' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Dec 01 09:49:50 compute-2 ceph-mon[76053]: mgrmap e12: compute-0.fospow(active, since 2m), standbys: compute-2.kdtkls, compute-1.ymizfm
Dec 01 09:49:50 compute-2 ceph-mon[76053]: 4.11 deep-scrub starts
Dec 01 09:49:50 compute-2 ceph-mon[76053]: 4.11 deep-scrub ok
Dec 01 09:49:50 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Dec 01 09:49:50 compute-2 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:49:51 compute-2 podman[78116]: 2025-12-01 09:49:51.25321242 +0000 UTC m=+0.037114975 container create 50430f289cbf0bc9565150836958efc84af9a91a97d4f5461106f304bef10e9d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_swanson, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec 01 09:49:51 compute-2 systemd[1]: Started libpod-conmon-50430f289cbf0bc9565150836958efc84af9a91a97d4f5461106f304bef10e9d.scope.
Dec 01 09:49:51 compute-2 systemd[1]: Started libcrun container.
Dec 01 09:49:51 compute-2 podman[78116]: 2025-12-01 09:49:51.328324281 +0000 UTC m=+0.112226866 container init 50430f289cbf0bc9565150836958efc84af9a91a97d4f5461106f304bef10e9d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_swanson, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, ceph=True)
Dec 01 09:49:51 compute-2 podman[78116]: 2025-12-01 09:49:51.237075963 +0000 UTC m=+0.020978538 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:49:51 compute-2 podman[78116]: 2025-12-01 09:49:51.334539435 +0000 UTC m=+0.118441990 container start 50430f289cbf0bc9565150836958efc84af9a91a97d4f5461106f304bef10e9d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_swanson, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Dec 01 09:49:51 compute-2 podman[78116]: 2025-12-01 09:49:51.337325653 +0000 UTC m=+0.121228208 container attach 50430f289cbf0bc9565150836958efc84af9a91a97d4f5461106f304bef10e9d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_swanson, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Dec 01 09:49:51 compute-2 silly_swanson[78133]: 167 167
Dec 01 09:49:51 compute-2 systemd[1]: libpod-50430f289cbf0bc9565150836958efc84af9a91a97d4f5461106f304bef10e9d.scope: Deactivated successfully.
Dec 01 09:49:51 compute-2 podman[78116]: 2025-12-01 09:49:51.340091101 +0000 UTC m=+0.123993666 container died 50430f289cbf0bc9565150836958efc84af9a91a97d4f5461106f304bef10e9d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_swanson, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:49:51 compute-2 systemd[1]: var-lib-containers-storage-overlay-846f9a4c5b8145f1d25d9d6d97da5a6c2d628285e477c97ad38bbce9920adddc-merged.mount: Deactivated successfully.
Dec 01 09:49:51 compute-2 podman[78116]: 2025-12-01 09:49:51.371819353 +0000 UTC m=+0.155721908 container remove 50430f289cbf0bc9565150836958efc84af9a91a97d4f5461106f304bef10e9d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_swanson, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec 01 09:49:51 compute-2 systemd[1]: libpod-conmon-50430f289cbf0bc9565150836958efc84af9a91a97d4f5461106f304bef10e9d.scope: Deactivated successfully.
Dec 01 09:49:51 compute-2 podman[78163]: 2025-12-01 09:49:51.59932582 +0000 UTC m=+0.041808461 container create e00fe591f42592a7d68f03907f1ea51c9728f74d1c863fcafd1e81317c055a0a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate-test, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 01 09:49:51 compute-2 systemd[1]: Started libpod-conmon-e00fe591f42592a7d68f03907f1ea51c9728f74d1c863fcafd1e81317c055a0a.scope.
Dec 01 09:49:51 compute-2 systemd[1]: Started libcrun container.
Dec 01 09:49:51 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a08c17bbc86329fff496d4637f0a64b53c7adb4f30e95a72dc0d4cd5b3e0da26/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:51 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a08c17bbc86329fff496d4637f0a64b53c7adb4f30e95a72dc0d4cd5b3e0da26/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:51 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a08c17bbc86329fff496d4637f0a64b53c7adb4f30e95a72dc0d4cd5b3e0da26/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:51 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a08c17bbc86329fff496d4637f0a64b53c7adb4f30e95a72dc0d4cd5b3e0da26/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:51 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a08c17bbc86329fff496d4637f0a64b53c7adb4f30e95a72dc0d4cd5b3e0da26/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:51 compute-2 podman[78163]: 2025-12-01 09:49:51.580279 +0000 UTC m=+0.022761661 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:49:51 compute-2 podman[78163]: 2025-12-01 09:49:51.768984361 +0000 UTC m=+0.211467062 container init e00fe591f42592a7d68f03907f1ea51c9728f74d1c863fcafd1e81317c055a0a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate-test, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 01 09:49:51 compute-2 podman[78163]: 2025-12-01 09:49:51.777122111 +0000 UTC m=+0.219604762 container start e00fe591f42592a7d68f03907f1ea51c9728f74d1c863fcafd1e81317c055a0a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate-test, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec 01 09:49:51 compute-2 podman[78163]: 2025-12-01 09:49:51.783125399 +0000 UTC m=+0.225608060 container attach e00fe591f42592a7d68f03907f1ea51c9728f74d1c863fcafd1e81317c055a0a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate-test, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1)
Dec 01 09:49:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate-test[78179]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_FSID]
Dec 01 09:49:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate-test[78179]:                             [--no-systemd] [--no-tmpfs]
Dec 01 09:49:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate-test[78179]: ceph-volume activate: error: unrecognized arguments: --bad-option
Dec 01 09:49:51 compute-2 ceph-mon[76053]: 4.e deep-scrub starts
Dec 01 09:49:51 compute-2 ceph-mon[76053]: 4.e deep-scrub ok
Dec 01 09:49:51 compute-2 ceph-mon[76053]: pgmap v110: 193 pgs: 193 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Dec 01 09:49:51 compute-2 ceph-mon[76053]: Deploying daemon osd.2 on compute-2
Dec 01 09:49:51 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3031876280' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Dec 01 09:49:51 compute-2 ceph-mon[76053]: 4.10 scrub starts
Dec 01 09:49:51 compute-2 ceph-mon[76053]: 4.10 scrub ok
Dec 01 09:49:51 compute-2 systemd[1]: libpod-e00fe591f42592a7d68f03907f1ea51c9728f74d1c863fcafd1e81317c055a0a.scope: Deactivated successfully.
Dec 01 09:49:51 compute-2 podman[78163]: 2025-12-01 09:49:51.993169385 +0000 UTC m=+0.435652036 container died e00fe591f42592a7d68f03907f1ea51c9728f74d1c863fcafd1e81317c055a0a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate-test, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.40.1, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec 01 09:49:52 compute-2 systemd[1]: var-lib-containers-storage-overlay-a08c17bbc86329fff496d4637f0a64b53c7adb4f30e95a72dc0d4cd5b3e0da26-merged.mount: Deactivated successfully.
Dec 01 09:49:52 compute-2 ceph-mgr[76365]: mgr handle_mgr_map respawning because set of enabled modules changed!
Dec 01 09:49:52 compute-2 ceph-mgr[76365]: mgr respawn  e: '/usr/bin/ceph-mgr'
Dec 01 09:49:52 compute-2 ceph-mgr[76365]: mgr respawn  0: '/usr/bin/ceph-mgr'
Dec 01 09:49:52 compute-2 ceph-mgr[76365]: mgr respawn  1: '-n'
Dec 01 09:49:52 compute-2 ceph-mgr[76365]: mgr respawn  2: 'mgr.compute-2.kdtkls'
Dec 01 09:49:52 compute-2 ceph-mgr[76365]: mgr respawn  3: '-f'
Dec 01 09:49:52 compute-2 ceph-mgr[76365]: mgr respawn  4: '--setuser'
Dec 01 09:49:52 compute-2 ceph-mgr[76365]: mgr respawn  5: 'ceph'
Dec 01 09:49:52 compute-2 ceph-mgr[76365]: mgr respawn  6: '--setgroup'
Dec 01 09:49:52 compute-2 ceph-mgr[76365]: mgr respawn  7: 'ceph'
Dec 01 09:49:52 compute-2 ceph-mgr[76365]: mgr respawn  8: '--default-log-to-file=false'
Dec 01 09:49:52 compute-2 ceph-mgr[76365]: mgr respawn  9: '--default-log-to-journald=true'
Dec 01 09:49:52 compute-2 ceph-mgr[76365]: mgr respawn  10: '--default-log-to-stderr=false'
Dec 01 09:49:52 compute-2 ceph-mgr[76365]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Dec 01 09:49:52 compute-2 ceph-mgr[76365]: mgr respawn  exe_path /proc/self/exe
Dec 01 09:49:52 compute-2 podman[78163]: 2025-12-01 09:49:52.03152077 +0000 UTC m=+0.474003421 container remove e00fe591f42592a7d68f03907f1ea51c9728f74d1c863fcafd1e81317c055a0a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate-test, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Dec 01 09:49:52 compute-2 systemd[1]: libpod-conmon-e00fe591f42592a7d68f03907f1ea51c9728f74d1c863fcafd1e81317c055a0a.scope: Deactivated successfully.
Dec 01 09:49:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: ignoring --setuser ceph since I am not root
Dec 01 09:49:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: ignoring --setgroup ceph since I am not root
Dec 01 09:49:52 compute-2 ceph-mgr[76365]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Dec 01 09:49:52 compute-2 ceph-mgr[76365]: pidfile_write: ignore empty --pid-file
Dec 01 09:49:52 compute-2 sshd-session[72854]: Connection closed by 192.168.122.100 port 56512
Dec 01 09:49:52 compute-2 sshd-session[73028]: Connection closed by 192.168.122.100 port 56558
Dec 01 09:49:52 compute-2 sshd-session[73057]: Connection closed by 192.168.122.100 port 56560
Dec 01 09:49:52 compute-2 sshd-session[72767]: Connection closed by 192.168.122.100 port 56492
Dec 01 09:49:52 compute-2 sshd-session[72766]: Connection closed by 192.168.122.100 port 56480
Dec 01 09:49:52 compute-2 sshd-session[72970]: Connection closed by 192.168.122.100 port 56538
Dec 01 09:49:52 compute-2 sshd-session[73001]: Connection closed by 192.168.122.100 port 56546
Dec 01 09:49:52 compute-2 sshd-session[72941]: Connection closed by 192.168.122.100 port 56532
Dec 01 09:49:52 compute-2 sshd-session[72912]: Connection closed by 192.168.122.100 port 56520
Dec 01 09:49:52 compute-2 sshd-session[72825]: Connection closed by 192.168.122.100 port 56498
Dec 01 09:49:52 compute-2 sshd-session[72883]: Connection closed by 192.168.122.100 port 56514
Dec 01 09:49:52 compute-2 sshd-session[72796]: Connection closed by 192.168.122.100 port 56494
Dec 01 09:49:52 compute-2 sshd-session[72743]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 01 09:49:52 compute-2 sshd-session[72761]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 01 09:49:52 compute-2 sshd-session[73025]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 01 09:49:52 compute-2 sshd-session[73054]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 01 09:49:52 compute-2 sshd-session[72998]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 01 09:49:52 compute-2 sshd-session[72822]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 01 09:49:52 compute-2 sshd-session[72909]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 01 09:49:52 compute-2 sshd-session[72967]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 01 09:49:52 compute-2 sshd-session[72880]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 01 09:49:52 compute-2 sshd-session[72851]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 01 09:49:52 compute-2 systemd[1]: session-31.scope: Deactivated successfully.
Dec 01 09:49:52 compute-2 sshd-session[72938]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 01 09:49:52 compute-2 systemd[1]: session-30.scope: Deactivated successfully.
Dec 01 09:49:52 compute-2 systemd[1]: session-22.scope: Deactivated successfully.
Dec 01 09:49:52 compute-2 systemd[1]: session-25.scope: Deactivated successfully.
Dec 01 09:49:52 compute-2 systemd[1]: session-20.scope: Deactivated successfully.
Dec 01 09:49:52 compute-2 systemd-logind[795]: Session 31 logged out. Waiting for processes to exit.
Dec 01 09:49:52 compute-2 systemd[1]: session-24.scope: Deactivated successfully.
Dec 01 09:49:52 compute-2 systemd[1]: session-27.scope: Deactivated successfully.
Dec 01 09:49:52 compute-2 systemd[1]: session-26.scope: Deactivated successfully.
Dec 01 09:49:52 compute-2 systemd[1]: session-28.scope: Deactivated successfully.
Dec 01 09:49:52 compute-2 systemd[1]: session-29.scope: Deactivated successfully.
Dec 01 09:49:52 compute-2 systemd-logind[795]: Session 30 logged out. Waiting for processes to exit.
Dec 01 09:49:52 compute-2 systemd-logind[795]: Session 25 logged out. Waiting for processes to exit.
Dec 01 09:49:52 compute-2 systemd-logind[795]: Session 20 logged out. Waiting for processes to exit.
Dec 01 09:49:52 compute-2 systemd-logind[795]: Session 32 logged out. Waiting for processes to exit.
Dec 01 09:49:52 compute-2 systemd-logind[795]: Session 28 logged out. Waiting for processes to exit.
Dec 01 09:49:52 compute-2 systemd-logind[795]: Session 24 logged out. Waiting for processes to exit.
Dec 01 09:49:52 compute-2 systemd-logind[795]: Session 27 logged out. Waiting for processes to exit.
Dec 01 09:49:52 compute-2 systemd-logind[795]: Session 29 logged out. Waiting for processes to exit.
Dec 01 09:49:52 compute-2 systemd-logind[795]: Session 22 logged out. Waiting for processes to exit.
Dec 01 09:49:52 compute-2 systemd-logind[795]: Session 26 logged out. Waiting for processes to exit.
Dec 01 09:49:52 compute-2 systemd-logind[795]: Removed session 31.
Dec 01 09:49:52 compute-2 systemd-logind[795]: Removed session 30.
Dec 01 09:49:52 compute-2 systemd-logind[795]: Removed session 22.
Dec 01 09:49:52 compute-2 systemd-logind[795]: Removed session 25.
Dec 01 09:49:52 compute-2 sshd-session[72793]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 01 09:49:52 compute-2 systemd[1]: session-23.scope: Deactivated successfully.
Dec 01 09:49:52 compute-2 systemd-logind[795]: Removed session 20.
Dec 01 09:49:52 compute-2 systemd-logind[795]: Session 23 logged out. Waiting for processes to exit.
Dec 01 09:49:52 compute-2 systemd-logind[795]: Removed session 24.
Dec 01 09:49:52 compute-2 systemd-logind[795]: Removed session 27.
Dec 01 09:49:52 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'alerts'
Dec 01 09:49:52 compute-2 systemd-logind[795]: Removed session 26.
Dec 01 09:49:52 compute-2 systemd-logind[795]: Removed session 28.
Dec 01 09:49:52 compute-2 systemd-logind[795]: Removed session 29.
Dec 01 09:49:52 compute-2 systemd-logind[795]: Removed session 23.
Dec 01 09:49:52 compute-2 ceph-mgr[76365]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 01 09:49:52 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'balancer'
Dec 01 09:49:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:52.283+0000 7f4327e77140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 01 09:49:52 compute-2 systemd[1]: Reloading.
Dec 01 09:49:52 compute-2 ceph-mgr[76365]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 01 09:49:52 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'cephadm'
Dec 01 09:49:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:52.366+0000 7f4327e77140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 01 09:49:52 compute-2 systemd-rc-local-generator[78261]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:49:52 compute-2 systemd-sysv-generator[78264]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:49:52 compute-2 systemd[1]: Reloading.
Dec 01 09:49:52 compute-2 systemd-sysv-generator[78304]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:49:52 compute-2 systemd-rc-local-generator[78301]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:49:52 compute-2 systemd[1]: Starting Ceph osd.2 for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec 01 09:49:52 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e37 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:49:53 compute-2 podman[78371]: 2025-12-01 09:49:53.046253027 +0000 UTC m=+0.060483912 container create decd153369c03a342344edd2a947b8b3f96735ec4e3aff9a3c0ee8d082e26e8a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:49:53 compute-2 ceph-mon[76053]: 4.5 deep-scrub starts
Dec 01 09:49:53 compute-2 ceph-mon[76053]: 4.5 deep-scrub ok
Dec 01 09:49:53 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3031876280' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Dec 01 09:49:53 compute-2 ceph-mon[76053]: mgrmap e13: compute-0.fospow(active, since 2m), standbys: compute-2.kdtkls, compute-1.ymizfm
Dec 01 09:49:53 compute-2 ceph-mon[76053]: 3.18 scrub starts
Dec 01 09:49:53 compute-2 ceph-mon[76053]: 3.18 scrub ok
Dec 01 09:49:53 compute-2 podman[78371]: 2025-12-01 09:49:53.010196398 +0000 UTC m=+0.024427313 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:49:53 compute-2 systemd[1]: Started libcrun container.
Dec 01 09:49:53 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c57d9cb1758964f1dd30e75c93872e80e3537418512075406c818d00371467a9/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:53 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c57d9cb1758964f1dd30e75c93872e80e3537418512075406c818d00371467a9/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:53 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c57d9cb1758964f1dd30e75c93872e80e3537418512075406c818d00371467a9/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:53 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c57d9cb1758964f1dd30e75c93872e80e3537418512075406c818d00371467a9/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:53 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c57d9cb1758964f1dd30e75c93872e80e3537418512075406c818d00371467a9/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:53 compute-2 podman[78371]: 2025-12-01 09:49:53.145074452 +0000 UTC m=+0.159305357 container init decd153369c03a342344edd2a947b8b3f96735ec4e3aff9a3c0ee8d082e26e8a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:49:53 compute-2 podman[78371]: 2025-12-01 09:49:53.151923871 +0000 UTC m=+0.166154756 container start decd153369c03a342344edd2a947b8b3f96735ec4e3aff9a3c0ee8d082e26e8a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True)
Dec 01 09:49:53 compute-2 podman[78371]: 2025-12-01 09:49:53.155875348 +0000 UTC m=+0.170106233 container attach decd153369c03a342344edd2a947b8b3f96735ec4e3aff9a3c0ee8d082e26e8a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:49:53 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'crash'
Dec 01 09:49:53 compute-2 ceph-mgr[76365]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 01 09:49:53 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'dashboard'
Dec 01 09:49:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:53.267+0000 7f4327e77140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 01 09:49:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate[78387]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 01 09:49:53 compute-2 bash[78371]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 01 09:49:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate[78387]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 01 09:49:53 compute-2 bash[78371]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 01 09:49:53 compute-2 lvm[78468]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 01 09:49:53 compute-2 lvm[78468]: VG ceph_vg0 finished
Dec 01 09:49:53 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'devicehealth'
Dec 01 09:49:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate[78387]: --> Failed to activate via raw: did not find any matching OSD to activate
Dec 01 09:49:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate[78387]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 01 09:49:53 compute-2 bash[78371]: --> Failed to activate via raw: did not find any matching OSD to activate
Dec 01 09:49:53 compute-2 bash[78371]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 01 09:49:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate[78387]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 01 09:49:53 compute-2 bash[78371]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 01 09:49:54 compute-2 ceph-mgr[76365]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 01 09:49:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:54.013+0000 7f4327e77140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 01 09:49:54 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'diskprediction_local'
Dec 01 09:49:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate[78387]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec 01 09:49:54 compute-2 bash[78371]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec 01 09:49:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate[78387]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Dec 01 09:49:54 compute-2 bash[78371]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Dec 01 09:49:54 compute-2 ceph-mon[76053]: 3.5 deep-scrub starts
Dec 01 09:49:54 compute-2 ceph-mon[76053]: 3.5 deep-scrub ok
Dec 01 09:49:54 compute-2 ceph-mon[76053]: 4.1e scrub starts
Dec 01 09:49:54 compute-2 ceph-mon[76053]: 4.1e scrub ok
Dec 01 09:49:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec 01 09:49:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec 01 09:49:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]:   from numpy import show_config as show_numpy_config
Dec 01 09:49:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:54.199+0000 7f4327e77140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 01 09:49:54 compute-2 ceph-mgr[76365]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 01 09:49:54 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'influx'
Dec 01 09:49:54 compute-2 ceph-mgr[76365]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 01 09:49:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:54.274+0000 7f4327e77140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 01 09:49:54 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'insights'
Dec 01 09:49:54 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'iostat'
Dec 01 09:49:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate[78387]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Dec 01 09:49:54 compute-2 bash[78371]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Dec 01 09:49:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate[78387]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Dec 01 09:49:54 compute-2 bash[78371]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Dec 01 09:49:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate[78387]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec 01 09:49:54 compute-2 bash[78371]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec 01 09:49:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate[78387]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec 01 09:49:54 compute-2 bash[78371]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec 01 09:49:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate[78387]: --> ceph-volume lvm activate successful for osd ID: 2
Dec 01 09:49:54 compute-2 bash[78371]: --> ceph-volume lvm activate successful for osd ID: 2
Dec 01 09:49:54 compute-2 ceph-mgr[76365]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 01 09:49:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:54.420+0000 7f4327e77140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 01 09:49:54 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'k8sevents'
Dec 01 09:49:54 compute-2 systemd[1]: libpod-decd153369c03a342344edd2a947b8b3f96735ec4e3aff9a3c0ee8d082e26e8a.scope: Deactivated successfully.
Dec 01 09:49:54 compute-2 systemd[1]: libpod-decd153369c03a342344edd2a947b8b3f96735ec4e3aff9a3c0ee8d082e26e8a.scope: Consumed 1.443s CPU time.
Dec 01 09:49:54 compute-2 podman[78371]: 2025-12-01 09:49:54.450168824 +0000 UTC m=+1.464399709 container died decd153369c03a342344edd2a947b8b3f96735ec4e3aff9a3c0ee8d082e26e8a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:49:54 compute-2 systemd[1]: var-lib-containers-storage-overlay-c57d9cb1758964f1dd30e75c93872e80e3537418512075406c818d00371467a9-merged.mount: Deactivated successfully.
Dec 01 09:49:54 compute-2 podman[78371]: 2025-12-01 09:49:54.490610681 +0000 UTC m=+1.504841566 container remove decd153369c03a342344edd2a947b8b3f96735ec4e3aff9a3c0ee8d082e26e8a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0)
Dec 01 09:49:54 compute-2 podman[78624]: 2025-12-01 09:49:54.669633253 +0000 UTC m=+0.037831864 container create 15b32d54f48f348cd5f78f937a7efd915573c2c29e1377ac51a71a81d67b7b4c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, OSD_FLAVOR=default)
Dec 01 09:49:54 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/242486e1516183d0f0081103992c146936728a5cec960c5192c67757ed443fb2/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:54 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/242486e1516183d0f0081103992c146936728a5cec960c5192c67757ed443fb2/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:54 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/242486e1516183d0f0081103992c146936728a5cec960c5192c67757ed443fb2/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:54 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/242486e1516183d0f0081103992c146936728a5cec960c5192c67757ed443fb2/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:54 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/242486e1516183d0f0081103992c146936728a5cec960c5192c67757ed443fb2/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Dec 01 09:49:54 compute-2 podman[78624]: 2025-12-01 09:49:54.726901413 +0000 UTC m=+0.095100054 container init 15b32d54f48f348cd5f78f937a7efd915573c2c29e1377ac51a71a81d67b7b4c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Dec 01 09:49:54 compute-2 podman[78624]: 2025-12-01 09:49:54.731854256 +0000 UTC m=+0.100052877 container start 15b32d54f48f348cd5f78f937a7efd915573c2c29e1377ac51a71a81d67b7b4c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:49:54 compute-2 bash[78624]: 15b32d54f48f348cd5f78f937a7efd915573c2c29e1377ac51a71a81d67b7b4c
Dec 01 09:49:54 compute-2 podman[78624]: 2025-12-01 09:49:54.65248493 +0000 UTC m=+0.020683571 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:49:54 compute-2 systemd[1]: Started Ceph osd.2 for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 09:49:54 compute-2 ceph-osd[78644]: set uid:gid to 167:167 (ceph:ceph)
Dec 01 09:49:54 compute-2 ceph-osd[78644]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-osd, pid 2
Dec 01 09:49:54 compute-2 ceph-osd[78644]: pidfile_write: ignore empty --pid-file
Dec 01 09:49:54 compute-2 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 01 09:49:54 compute-2 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 01 09:49:54 compute-2 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:49:54 compute-2 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:49:54 compute-2 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) close
Dec 01 09:49:54 compute-2 sudo[78052]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:54 compute-2 systemd[1]: session-32.scope: Deactivated successfully.
Dec 01 09:49:54 compute-2 systemd[1]: session-32.scope: Consumed 1min 22.549s CPU time.
Dec 01 09:49:54 compute-2 systemd-logind[795]: Removed session 32.
Dec 01 09:49:54 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'localpool'
Dec 01 09:49:54 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'mds_autoscaler'
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) close
Dec 01 09:49:55 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'mirroring'
Dec 01 09:49:55 compute-2 ceph-mon[76053]: 4.1 deep-scrub starts
Dec 01 09:49:55 compute-2 ceph-mon[76053]: 4.1 deep-scrub ok
Dec 01 09:49:55 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'nfs'
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) close
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) close
Dec 01 09:49:55 compute-2 ceph-mgr[76365]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 01 09:49:55 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'orchestrator'
Dec 01 09:49:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:55.535+0000 7f4327e77140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) close
Dec 01 09:49:55 compute-2 ceph-mgr[76365]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 01 09:49:55 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'osd_perf_query'
Dec 01 09:49:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:55.771+0000 7f4327e77140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 01 09:49:55 compute-2 ceph-mgr[76365]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 01 09:49:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:55.860+0000 7f4327e77140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 01 09:49:55 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'osd_support'
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bdev(0x5636542a1800 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bdev(0x5636542a1800 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bdev(0x5636542a1800 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bdev(0x5636542a1800 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Dec 01 09:49:55 compute-2 ceph-osd[78644]: bdev(0x5636542a1800 /var/lib/ceph/osd/ceph-2/block) close
Dec 01 09:49:55 compute-2 ceph-mgr[76365]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 01 09:49:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:55.947+0000 7f4327e77140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 01 09:49:55 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'pg_autoscaler'
Dec 01 09:49:56 compute-2 ceph-mgr[76365]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 01 09:49:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:56.043+0000 7f4327e77140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 01 09:49:56 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'progress'
Dec 01 09:49:56 compute-2 ceph-mgr[76365]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 01 09:49:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:56.128+0000 7f4327e77140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 01 09:49:56 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'prometheus'
Dec 01 09:49:56 compute-2 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) close
Dec 01 09:49:56 compute-2 ceph-mon[76053]: 2.19 scrub starts
Dec 01 09:49:56 compute-2 ceph-mon[76053]: 2.19 scrub ok
Dec 01 09:49:56 compute-2 ceph-mon[76053]: 3.9 scrub starts
Dec 01 09:49:56 compute-2 ceph-mon[76053]: 3.9 scrub ok
Dec 01 09:49:56 compute-2 ceph-osd[78644]: starting osd.2 osd_data /var/lib/ceph/osd/ceph-2 /var/lib/ceph/osd/ceph-2/journal
Dec 01 09:49:56 compute-2 ceph-osd[78644]: load: jerasure load: lrc 
Dec 01 09:49:56 compute-2 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 01 09:49:56 compute-2 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 01 09:49:56 compute-2 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:49:56 compute-2 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:49:56 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 01 09:49:56 compute-2 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) close
Dec 01 09:49:56 compute-2 ceph-mgr[76365]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 01 09:49:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:56.514+0000 7f4327e77140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 01 09:49:56 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'rbd_support'
Dec 01 09:49:56 compute-2 ceph-mgr[76365]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 01 09:49:56 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'restful'
Dec 01 09:49:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:56.614+0000 7f4327e77140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 01 09:49:56 compute-2 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 01 09:49:56 compute-2 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 01 09:49:56 compute-2 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:49:56 compute-2 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:49:56 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 01 09:49:56 compute-2 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) close
Dec 01 09:49:56 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'rgw'
Dec 01 09:49:56 compute-2 ceph-osd[78644]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Dec 01 09:49:56 compute-2 ceph-osd[78644]: osd.2:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Dec 01 09:49:56 compute-2 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 01 09:49:56 compute-2 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 01 09:49:56 compute-2 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:49:56 compute-2 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:49:56 compute-2 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) close
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) close
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) close
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bdev(0x563655119000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bdev(0x563655119000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bdev(0x563655119000 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bdev(0x563655119000 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluefs mount
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluefs mount shared_bdev_used = 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Dec 01 09:49:57 compute-2 ceph-mgr[76365]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 01 09:49:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:57.081+0000 7f4327e77140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 01 09:49:57 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'rook'
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: RocksDB version: 7.9.2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Git sha 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Compile date 2025-07-17 03:12:14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: DB SUMMARY
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: DB Session ID:  TAXZ38CZ4XC0ICB4FDTB
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: CURRENT file:  CURRENT
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: IDENTITY file:  IDENTITY
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                         Options.error_if_exists: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.create_if_missing: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                         Options.paranoid_checks: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                                     Options.env: 0x5636542f5650
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                                Options.info_log: 0x56365511d4a0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.max_file_opening_threads: 16
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                              Options.statistics: (nil)
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                               Options.use_fsync: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.max_log_file_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                         Options.allow_fallocate: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.use_direct_reads: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.create_missing_column_families: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                              Options.db_log_dir: 
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                                 Options.wal_dir: db.wal
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.advise_random_on_open: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.write_buffer_manager: 0x563655210a00
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                            Options.rate_limiter: (nil)
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.unordered_write: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                               Options.row_cache: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                              Options.wal_filter: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.allow_ingest_behind: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.two_write_queues: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.manual_wal_flush: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.wal_compression: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.atomic_flush: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.log_readahead_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.allow_data_in_errors: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.db_host_id: __hostname__
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.max_background_jobs: 4
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.max_background_compactions: -1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.max_subcompactions: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.max_open_files: -1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.bytes_per_sync: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.max_background_flushes: -1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Compression algorithms supported:
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         kZSTD supported: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         kXpressCompression supported: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         kBZip2Compression supported: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         kLZ4Compression supported: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         kZlibCompression supported: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         kLZ4HCCompression supported: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         kSnappyCompression supported: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d860)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x563654337350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d860)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x563654337350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d860)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x563654337350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d860)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x563654337350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d860)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x563654337350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d860)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x563654337350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d860)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x563654337350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d880)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5636543369b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d880)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5636543369b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d880)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5636543369b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 34cdeff5-08c1-4205-95b7-d3ec63b89ea7
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582597121410, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582597121761, "job": 1, "event": "recovery_finished"}
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old nid_max 1025
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old blobid_max 10240
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta min_alloc_size 0x1000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: freelist init
Dec 01 09:49:57 compute-2 ceph-osd[78644]: freelist _read_cfg
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _init_alloc loaded 20 GiB in 2 extents, allocator type hybrid, capacity 0x4ffc00000, block size 0x1000, free 0x4ffbfd000, fragmentation 1.9e-07
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluefs umount
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bdev(0x563655119000 /var/lib/ceph/osd/ceph-2/block) close
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bdev(0x563655119000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bdev(0x563655119000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bdev(0x563655119000 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bdev(0x563655119000 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluefs mount
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluefs mount shared_bdev_used = 4718592
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: RocksDB version: 7.9.2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Git sha 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Compile date 2025-07-17 03:12:14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: DB SUMMARY
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: DB Session ID:  TAXZ38CZ4XC0ICB4FDTA
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: CURRENT file:  CURRENT
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: IDENTITY file:  IDENTITY
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                         Options.error_if_exists: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.create_if_missing: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                         Options.paranoid_checks: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                                     Options.env: 0x5636542f5110
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                                Options.info_log: 0x56365511d640
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.max_file_opening_threads: 16
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                              Options.statistics: (nil)
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                               Options.use_fsync: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.max_log_file_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                         Options.allow_fallocate: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.use_direct_reads: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.create_missing_column_families: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                              Options.db_log_dir: 
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                                 Options.wal_dir: db.wal
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.advise_random_on_open: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.write_buffer_manager: 0x563655210a00
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                            Options.rate_limiter: (nil)
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.unordered_write: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                               Options.row_cache: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                              Options.wal_filter: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.allow_ingest_behind: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.two_write_queues: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.manual_wal_flush: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.wal_compression: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.atomic_flush: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.log_readahead_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.allow_data_in_errors: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.db_host_id: __hostname__
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.max_background_jobs: 4
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.max_background_compactions: -1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.max_subcompactions: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.max_open_files: -1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.bytes_per_sync: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.max_background_flushes: -1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Compression algorithms supported:
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         kZSTD supported: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         kXpressCompression supported: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         kBZip2Compression supported: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         kLZ4Compression supported: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         kZlibCompression supported: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         kLZ4HCCompression supported: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         kSnappyCompression supported: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d380)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x563654337350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d380)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x563654337350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d380)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x563654337350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d380)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x563654337350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d380)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x563654337350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d380)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x563654337350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d380)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x563654337350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d7c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5636543369b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d7c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5636543369b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d7c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5636543369b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 34cdeff5-08c1-4205-95b7-d3ec63b89ea7
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582597411451, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582597486306, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582597, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "34cdeff5-08c1-4205-95b7-d3ec63b89ea7", "db_session_id": "TAXZ38CZ4XC0ICB4FDTA", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:49:57 compute-2 ceph-mon[76053]: 3.1a scrub starts
Dec 01 09:49:57 compute-2 ceph-mon[76053]: 3.1a scrub ok
Dec 01 09:49:57 compute-2 ceph-mon[76053]: 2.15 scrub starts
Dec 01 09:49:57 compute-2 ceph-mon[76053]: 2.15 scrub ok
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582597506126, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1595, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 469, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 571, "raw_average_value_size": 285, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582597, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "34cdeff5-08c1-4205-95b7-d3ec63b89ea7", "db_session_id": "TAXZ38CZ4XC0ICB4FDTA", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582597512644, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582597, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "34cdeff5-08c1-4205-95b7-d3ec63b89ea7", "db_session_id": "TAXZ38CZ4XC0ICB4FDTA", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582597518743, "job": 1, "event": "recovery_finished"}
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x56365547a000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: DB pointer 0x56365545a000
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super from 4, latest 4
Dec 01 09:49:57 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super done
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:49:57 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 0.2 total, 0.2 interval
                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.2 total, 0.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds
                                           Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.2 total, 0.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.2 total, 0.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.2 total, 0.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.020       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.020       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.020       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.02              0.00         1    0.020       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.2 total, 0.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.2 total, 0.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.2 total, 0.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.2 total, 0.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5636543369b0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 2.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.2 total, 0.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5636543369b0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 2.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.2 total, 0.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5636543369b0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 2.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.006       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.006       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.2 total, 0.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.2 total, 0.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 01 09:49:57 compute-2 ceph-osd[78644]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Dec 01 09:49:57 compute-2 ceph-osd[78644]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/hello/cls_hello.cc:316: loading cls_hello
Dec 01 09:49:57 compute-2 ceph-osd[78644]: _get_class not permitted to load lua
Dec 01 09:49:57 compute-2 ceph-osd[78644]: _get_class not permitted to load sdk
Dec 01 09:49:57 compute-2 ceph-osd[78644]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Dec 01 09:49:57 compute-2 ceph-osd[78644]: osd.2 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Dec 01 09:49:57 compute-2 ceph-osd[78644]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Dec 01 09:49:57 compute-2 ceph-osd[78644]: osd.2 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Dec 01 09:49:57 compute-2 ceph-osd[78644]: osd.2 0 load_pgs
Dec 01 09:49:57 compute-2 ceph-osd[78644]: osd.2 0 load_pgs opened 0 pgs
Dec 01 09:49:57 compute-2 ceph-osd[78644]: osd.2 0 log_to_monitors true
Dec 01 09:49:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2[78640]: 2025-12-01T09:49:57.569+0000 7f875f458740 -1 osd.2 0 log_to_monitors true
Dec 01 09:49:57 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]} v 0)
Dec 01 09:49:57 compute-2 ceph-mon[76053]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/1185161015,v1:192.168.122.102:6801/1185161015]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Dec 01 09:49:57 compute-2 ceph-mgr[76365]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 01 09:49:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:57.705+0000 7f4327e77140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 01 09:49:57 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'selftest'
Dec 01 09:49:57 compute-2 ceph-mgr[76365]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 01 09:49:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:57.778+0000 7f4327e77140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 01 09:49:57 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'snap_schedule'
Dec 01 09:49:57 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e37 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:49:57 compute-2 ceph-mgr[76365]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec 01 09:49:57 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'stats'
Dec 01 09:49:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:57.870+0000 7f4327e77140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec 01 09:49:57 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'status'
Dec 01 09:49:58 compute-2 ceph-mgr[76365]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec 01 09:49:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:58.037+0000 7f4327e77140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec 01 09:49:58 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'telegraf'
Dec 01 09:49:58 compute-2 ceph-mgr[76365]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 01 09:49:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:58.118+0000 7f4327e77140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 01 09:49:58 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'telemetry'
Dec 01 09:49:58 compute-2 ceph-mgr[76365]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 01 09:49:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:58.302+0000 7f4327e77140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 01 09:49:58 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'test_orchestrator'
Dec 01 09:49:58 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e38 e38: 3 total, 2 up, 3 in
Dec 01 09:49:58 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]} v 0)
Dec 01 09:49:58 compute-2 ceph-mon[76053]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/1185161015,v1:192.168.122.102:6801/1185161015]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Dec 01 09:49:58 compute-2 ceph-mon[76053]: 3.1c scrub starts
Dec 01 09:49:58 compute-2 ceph-mon[76053]: 3.1c scrub ok
Dec 01 09:49:58 compute-2 ceph-mon[76053]: 2.e scrub starts
Dec 01 09:49:58 compute-2 ceph-mon[76053]: 2.e scrub ok
Dec 01 09:49:58 compute-2 ceph-mon[76053]: from='osd.2 [v2:192.168.122.102:6800/1185161015,v1:192.168.122.102:6801/1185161015]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Dec 01 09:49:58 compute-2 ceph-mon[76053]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Dec 01 09:49:58 compute-2 ceph-mon[76053]: Active manager daemon compute-0.fospow restarted
Dec 01 09:49:58 compute-2 ceph-mon[76053]: Activating manager daemon compute-0.fospow
Dec 01 09:49:58 compute-2 ceph-mon[76053]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Dec 01 09:49:58 compute-2 ceph-mon[76053]: from='osd.2 [v2:192.168.122.102:6800/1185161015,v1:192.168.122.102:6801/1185161015]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Dec 01 09:49:58 compute-2 ceph-mon[76053]: osdmap e38: 3 total, 2 up, 3 in
Dec 01 09:49:58 compute-2 ceph-mon[76053]: mgrmap e14: compute-0.fospow(active, starting, since 0.0399499s), standbys: compute-2.kdtkls, compute-1.ymizfm
Dec 01 09:49:58 compute-2 ceph-mon[76053]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Dec 01 09:49:58 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Dec 01 09:49:58 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 01 09:49:58 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Dec 01 09:49:58 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mgr metadata", "who": "compute-0.fospow", "id": "compute-0.fospow"}]: dispatch
Dec 01 09:49:58 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mgr metadata", "who": "compute-2.kdtkls", "id": "compute-2.kdtkls"}]: dispatch
Dec 01 09:49:58 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mgr metadata", "who": "compute-1.ymizfm", "id": "compute-1.ymizfm"}]: dispatch
Dec 01 09:49:58 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:49:58 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:49:58 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:49:58 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mds metadata"}]: dispatch
Dec 01 09:49:58 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec 01 09:49:58 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata"}]: dispatch
Dec 01 09:49:58 compute-2 ceph-mon[76053]: Manager daemon compute-0.fospow is now available
Dec 01 09:49:58 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.fospow/mirror_snapshot_schedule"}]: dispatch
Dec 01 09:49:58 compute-2 ceph-mon[76053]: Standby manager daemon compute-1.ymizfm restarted
Dec 01 09:49:58 compute-2 ceph-mon[76053]: Standby manager daemon compute-1.ymizfm started
Dec 01 09:49:58 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.fospow/trash_purge_schedule"}]: dispatch
Dec 01 09:49:58 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Dec 01 09:49:58 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Dec 01 09:49:58 compute-2 ceph-mgr[76365]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 01 09:49:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:58.563+0000 7f4327e77140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 01 09:49:58 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'volumes'
Dec 01 09:49:58 compute-2 sshd-session[79093]: Accepted publickey for ceph-admin from 192.168.122.100 port 50904 ssh2: RSA SHA256:Z7lOpv0ev+jSExhAdyqEmkuT2VPaBkXv4p0gmi28XsM
Dec 01 09:49:58 compute-2 systemd-logind[795]: New session 33 of user ceph-admin.
Dec 01 09:49:58 compute-2 systemd[1]: Started Session 33 of User ceph-admin.
Dec 01 09:49:58 compute-2 sshd-session[79093]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:49:58 compute-2 ceph-mgr[76365]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 01 09:49:58 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'zabbix'
Dec 01 09:49:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:58.874+0000 7f4327e77140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 01 09:49:58 compute-2 sudo[79097]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:49:58 compute-2 sudo[79097]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:58 compute-2 sudo[79097]: pam_unix(sudo:session): session closed for user root
Dec 01 09:49:58 compute-2 ceph-mgr[76365]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 01 09:49:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:58.957+0000 7f4327e77140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 01 09:49:58 compute-2 ceph-mgr[76365]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:49:58 compute-2 ceph-mgr[76365]: mgr load Constructed class from module: dashboard
Dec 01 09:49:58 compute-2 ceph-mgr[76365]: [dashboard INFO root] server: ssl=no host=192.168.122.102 port=8443
Dec 01 09:49:58 compute-2 ceph-mgr[76365]: [dashboard INFO root] Configured CherryPy, starting engine...
Dec 01 09:49:58 compute-2 ceph-mgr[76365]: [dashboard INFO root] Starting engine...
Dec 01 09:49:58 compute-2 ceph-mgr[76365]: ms_deliver_dispatch: unhandled message 0x55f566ae1860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Dec 01 09:49:58 compute-2 sudo[79122]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Dec 01 09:49:58 compute-2 sudo[79122]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:49:59 compute-2 ceph-mgr[76365]: [dashboard INFO root] Engine started...
Dec 01 09:49:59 compute-2 podman[79231]: 2025-12-01 09:49:59.542496487 +0000 UTC m=+0.065495806 container exec 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:49:59 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e39 e39: 3 total, 2 up, 3 in
Dec 01 09:49:59 compute-2 ceph-osd[78644]: osd.2 0 done with init, starting boot process
Dec 01 09:49:59 compute-2 ceph-osd[78644]: osd.2 0 start_boot
Dec 01 09:49:59 compute-2 ceph-osd[78644]: osd.2 0 maybe_override_options_for_qos osd_max_backfills set to 1
Dec 01 09:49:59 compute-2 ceph-osd[78644]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Dec 01 09:49:59 compute-2 ceph-osd[78644]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Dec 01 09:49:59 compute-2 ceph-osd[78644]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Dec 01 09:49:59 compute-2 ceph-osd[78644]: osd.2 0  bench count 12288000 bsize 4 KiB
Dec 01 09:49:59 compute-2 podman[79231]: 2025-12-01 09:49:59.650705573 +0000 UTC m=+0.173704872 container exec_died 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:49:59 compute-2 ceph-mon[76053]: 4.18 deep-scrub starts
Dec 01 09:49:59 compute-2 ceph-mon[76053]: 4.18 deep-scrub ok
Dec 01 09:49:59 compute-2 ceph-mon[76053]: 2.13 scrub starts
Dec 01 09:49:59 compute-2 ceph-mon[76053]: 2.13 scrub ok
Dec 01 09:49:59 compute-2 ceph-mon[76053]: Standby manager daemon compute-2.kdtkls restarted
Dec 01 09:49:59 compute-2 ceph-mon[76053]: Standby manager daemon compute-2.kdtkls started
Dec 01 09:50:00 compute-2 sudo[79122]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:00 compute-2 sudo[79320]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:50:00 compute-2 sudo[79320]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:00 compute-2 sudo[79320]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:00 compute-2 sudo[79345]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 09:50:00 compute-2 sudo[79345]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:00 compute-2 sudo[79345]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:00 compute-2 ceph-mon[76053]: purged_snaps scrub starts
Dec 01 09:50:00 compute-2 ceph-mon[76053]: purged_snaps scrub ok
Dec 01 09:50:00 compute-2 ceph-mon[76053]: 4.1a deep-scrub starts
Dec 01 09:50:00 compute-2 ceph-mon[76053]: 4.1a deep-scrub ok
Dec 01 09:50:00 compute-2 ceph-mon[76053]: 2.d deep-scrub starts
Dec 01 09:50:00 compute-2 ceph-mon[76053]: 2.d deep-scrub ok
Dec 01 09:50:00 compute-2 ceph-mon[76053]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]': finished
Dec 01 09:50:00 compute-2 ceph-mon[76053]: osdmap e39: 3 total, 2 up, 3 in
Dec 01 09:50:00 compute-2 ceph-mon[76053]: 4.c scrub starts
Dec 01 09:50:00 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:50:00 compute-2 ceph-mon[76053]: 4.c scrub ok
Dec 01 09:50:00 compute-2 ceph-mon[76053]: mgrmap e15: compute-0.fospow(active, since 1.44272s), standbys: compute-1.ymizfm, compute-2.kdtkls
Dec 01 09:50:00 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec 01 09:50:00 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:50:00 compute-2 ceph-mon[76053]: 2.1 scrub starts
Dec 01 09:50:00 compute-2 ceph-mon[76053]: 2.1 scrub ok
Dec 01 09:50:00 compute-2 ceph-mon[76053]: [01/Dec/2025:09:49:59] ENGINE Bus STARTING
Dec 01 09:50:00 compute-2 ceph-mon[76053]: overall HEALTH_OK
Dec 01 09:50:00 compute-2 ceph-mon[76053]: [01/Dec/2025:09:50:00] ENGINE Serving on http://192.168.122.100:8765
Dec 01 09:50:00 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec 01 09:50:00 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec 01 09:50:00 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec 01 09:50:00 compute-2 ceph-mon[76053]: [01/Dec/2025:09:50:00] ENGINE Serving on https://192.168.122.100:7150
Dec 01 09:50:00 compute-2 ceph-mon[76053]: [01/Dec/2025:09:50:00] ENGINE Bus STARTED
Dec 01 09:50:00 compute-2 ceph-mon[76053]: [01/Dec/2025:09:50:00] ENGINE Client ('192.168.122.100', 46558) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 01 09:50:00 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec 01 09:50:00 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec 01 09:50:00 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec 01 09:50:00 compute-2 sudo[79402]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:50:00 compute-2 sudo[79402]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:00 compute-2 sudo[79402]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:01 compute-2 sudo[79427]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Dec 01 09:50:01 compute-2 sudo[79427]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:01 compute-2 sudo[79427]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:01 compute-2 ceph-mon[76053]: pgmap v5: 193 pgs: 164 active+clean, 29 unknown; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Dec 01 09:50:01 compute-2 ceph-mon[76053]: 4.1b scrub starts
Dec 01 09:50:01 compute-2 ceph-mon[76053]: 4.1b scrub ok
Dec 01 09:50:01 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:50:01 compute-2 ceph-mon[76053]: 2.6 scrub starts
Dec 01 09:50:01 compute-2 ceph-mon[76053]: 2.6 scrub ok
Dec 01 09:50:01 compute-2 ceph-mon[76053]: from='client.14340 -' entity='client.admin' cmd=[{"prefix": "dashboard set-grafana-api-password", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:50:01 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec 01 09:50:01 compute-2 ceph-mon[76053]: mgrmap e16: compute-0.fospow(active, since 3s), standbys: compute-1.ymizfm, compute-2.kdtkls
Dec 01 09:50:01 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec 01 09:50:01 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec 01 09:50:01 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec 01 09:50:01 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Dec 01 09:50:01 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec 01 09:50:01 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Dec 01 09:50:01 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:50:01 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec 01 09:50:01 compute-2 sudo[79470]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 01 09:50:01 compute-2 sudo[79470]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:01 compute-2 sudo[79470]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:01 compute-2 sudo[79495]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph
Dec 01 09:50:01 compute-2 sudo[79495]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:01 compute-2 sudo[79495]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:01 compute-2 sudo[79520]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.conf.new
Dec 01 09:50:01 compute-2 sudo[79520]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:01 compute-2 sudo[79520]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:02 compute-2 sudo[79545]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:50:02 compute-2 sudo[79545]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:02 compute-2 sudo[79545]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:02 compute-2 sudo[79570]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.conf.new
Dec 01 09:50:02 compute-2 sudo[79570]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:02 compute-2 sudo[79570]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:02 compute-2 sudo[79618]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.conf.new
Dec 01 09:50:02 compute-2 sudo[79618]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:02 compute-2 sudo[79618]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:02 compute-2 sudo[79643]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.conf.new
Dec 01 09:50:02 compute-2 sudo[79643]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:02 compute-2 sudo[79643]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:02 compute-2 sudo[79668]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 01 09:50:02 compute-2 sudo[79668]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:02 compute-2 sudo[79668]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:02 compute-2 sudo[79693]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config
Dec 01 09:50:02 compute-2 sudo[79693]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:02 compute-2 sudo[79693]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:02 compute-2 sudo[79719]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config
Dec 01 09:50:02 compute-2 sudo[79719]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:02 compute-2 sudo[79719]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:02 compute-2 sudo[79744]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf.new
Dec 01 09:50:02 compute-2 sudo[79744]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:02 compute-2 sudo[79744]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:02 compute-2 sudo[79769]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:50:02 compute-2 sudo[79769]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:02 compute-2 sudo[79769]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:02 compute-2 sudo[79794]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf.new
Dec 01 09:50:02 compute-2 sudo[79794]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:02 compute-2 sudo[79794]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:02 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e39 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:50:02 compute-2 ceph-mon[76053]: 7.1c scrub starts
Dec 01 09:50:02 compute-2 ceph-mon[76053]: 7.1c scrub ok
Dec 01 09:50:02 compute-2 ceph-mon[76053]: Adjusting osd_memory_target on compute-1 to 128.0M
Dec 01 09:50:02 compute-2 ceph-mon[76053]: Adjusting osd_memory_target on compute-0 to 128.0M
Dec 01 09:50:02 compute-2 ceph-mon[76053]: Unable to set osd_memory_target on compute-1 to 134220595: error parsing value: Value '134220595' is below minimum 939524096
Dec 01 09:50:02 compute-2 ceph-mon[76053]: Unable to set osd_memory_target on compute-0 to 134220595: error parsing value: Value '134220595' is below minimum 939524096
Dec 01 09:50:02 compute-2 ceph-mon[76053]: 2.9 scrub starts
Dec 01 09:50:02 compute-2 ceph-mon[76053]: 2.9 scrub ok
Dec 01 09:50:02 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec 01 09:50:02 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Dec 01 09:50:02 compute-2 ceph-mon[76053]: Adjusting osd_memory_target on compute-2 to 127.9M
Dec 01 09:50:02 compute-2 ceph-mon[76053]: Unable to set osd_memory_target on compute-2 to 134214860: error parsing value: Value '134214860' is below minimum 939524096
Dec 01 09:50:02 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:50:02 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:50:02 compute-2 ceph-mon[76053]: Updating compute-0:/etc/ceph/ceph.conf
Dec 01 09:50:02 compute-2 ceph-mon[76053]: Updating compute-1:/etc/ceph/ceph.conf
Dec 01 09:50:02 compute-2 ceph-mon[76053]: Updating compute-2:/etc/ceph/ceph.conf
Dec 01 09:50:02 compute-2 ceph-mon[76053]: from='client.14346 -' entity='client.admin' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://192.168.122.100:9093", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:50:02 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec 01 09:50:02 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:50:02 compute-2 sudo[79842]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf.new
Dec 01 09:50:02 compute-2 sudo[79842]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:02 compute-2 sudo[79842]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:03 compute-2 sudo[79867]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf.new
Dec 01 09:50:03 compute-2 sudo[79867]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:03 compute-2 sudo[79867]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:03 compute-2 sudo[79892]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf.new /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec 01 09:50:03 compute-2 sudo[79892]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:03 compute-2 sudo[79892]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:03 compute-2 sudo[79917]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 01 09:50:03 compute-2 sudo[79917]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:03 compute-2 sudo[79917]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:03 compute-2 sudo[79942]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph
Dec 01 09:50:03 compute-2 sudo[79942]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:03 compute-2 sudo[79942]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:03 compute-2 sudo[79967]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.client.admin.keyring.new
Dec 01 09:50:03 compute-2 sudo[79967]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:03 compute-2 sudo[79967]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:03 compute-2 sudo[79992]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:50:03 compute-2 sudo[79992]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:03 compute-2 sudo[79992]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:03 compute-2 sudo[80017]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.client.admin.keyring.new
Dec 01 09:50:03 compute-2 sudo[80017]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:03 compute-2 sudo[80017]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:03 compute-2 sudo[80065]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.client.admin.keyring.new
Dec 01 09:50:03 compute-2 sudo[80065]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:03 compute-2 sudo[80065]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:04 compute-2 ceph-mon[76053]: pgmap v6: 193 pgs: 164 active+clean, 29 unknown; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Dec 01 09:50:04 compute-2 ceph-mon[76053]: Updating compute-1:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec 01 09:50:04 compute-2 ceph-mon[76053]: Updating compute-0:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec 01 09:50:04 compute-2 ceph-mon[76053]: Updating compute-2:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec 01 09:50:04 compute-2 ceph-mon[76053]: 7.12 scrub starts
Dec 01 09:50:04 compute-2 ceph-mon[76053]: 7.12 scrub ok
Dec 01 09:50:04 compute-2 ceph-mon[76053]: 2.4 scrub starts
Dec 01 09:50:04 compute-2 ceph-mon[76053]: 2.4 scrub ok
Dec 01 09:50:04 compute-2 ceph-mon[76053]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Dec 01 09:50:04 compute-2 ceph-mon[76053]: from='client.14352 -' entity='client.admin' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://192.168.122.100:9092", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:50:04 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec 01 09:50:04 compute-2 ceph-mon[76053]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Dec 01 09:50:04 compute-2 ceph-mon[76053]: mgrmap e17: compute-0.fospow(active, since 4s), standbys: compute-1.ymizfm, compute-2.kdtkls
Dec 01 09:50:04 compute-2 ceph-mon[76053]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Dec 01 09:50:04 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:50:04 compute-2 sudo[80090]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.client.admin.keyring.new
Dec 01 09:50:04 compute-2 sudo[80090]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:04 compute-2 sudo[80090]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:04 compute-2 sudo[80115]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 01 09:50:04 compute-2 sudo[80115]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:04 compute-2 sudo[80115]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:04 compute-2 sudo[80140]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config
Dec 01 09:50:04 compute-2 sudo[80140]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:04 compute-2 sudo[80140]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:04 compute-2 sudo[80165]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config
Dec 01 09:50:04 compute-2 sudo[80165]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:04 compute-2 sudo[80165]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:04 compute-2 sudo[80190]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring.new
Dec 01 09:50:04 compute-2 sudo[80190]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:04 compute-2 sudo[80190]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:04 compute-2 sudo[80215]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:50:04 compute-2 sudo[80215]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:04 compute-2 sudo[80215]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:04 compute-2 sudo[80240]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring.new
Dec 01 09:50:04 compute-2 sudo[80240]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:04 compute-2 sudo[80240]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:04 compute-2 sudo[80288]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring.new
Dec 01 09:50:04 compute-2 sudo[80288]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:04 compute-2 sudo[80288]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:04 compute-2 sudo[80313]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring.new
Dec 01 09:50:04 compute-2 sudo[80313]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:04 compute-2 sudo[80313]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:04 compute-2 sudo[80338]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring.new /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec 01 09:50:04 compute-2 sudo[80338]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:04 compute-2 sudo[80338]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:04 compute-2 ceph-osd[78644]: osd.2 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 17.586 iops: 4501.925 elapsed_sec: 0.666
Dec 01 09:50:04 compute-2 ceph-osd[78644]: log_channel(cluster) log [WRN] : OSD bench result of 4501.924530 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec 01 09:50:04 compute-2 ceph-osd[78644]: osd.2 0 waiting for initial osdmap
Dec 01 09:50:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2[78640]: 2025-12-01T09:50:04.910+0000 7f875b3db640 -1 osd.2 0 waiting for initial osdmap
Dec 01 09:50:04 compute-2 ceph-osd[78644]: osd.2 39 crush map has features 288514051259236352, adjusting msgr requires for clients
Dec 01 09:50:04 compute-2 ceph-osd[78644]: osd.2 39 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Dec 01 09:50:04 compute-2 ceph-osd[78644]: osd.2 39 crush map has features 3314933000852226048, adjusting msgr requires for osds
Dec 01 09:50:04 compute-2 ceph-osd[78644]: osd.2 39 check_osdmap_features require_osd_release unknown -> squid
Dec 01 09:50:04 compute-2 ceph-osd[78644]: osd.2 39 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec 01 09:50:04 compute-2 ceph-osd[78644]: osd.2 39 set_numa_affinity not setting numa affinity
Dec 01 09:50:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2[78640]: 2025-12-01T09:50:04.944+0000 7f8756a03640 -1 osd.2 39 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec 01 09:50:04 compute-2 ceph-osd[78644]: osd.2 39 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial no unique device path for loop3: no symlink to loop3 in /dev/disk/by-path
Dec 01 09:50:05 compute-2 ceph-mon[76053]: Updating compute-1:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec 01 09:50:05 compute-2 ceph-mon[76053]: 7.17 deep-scrub starts
Dec 01 09:50:05 compute-2 ceph-mon[76053]: 7.17 deep-scrub ok
Dec 01 09:50:05 compute-2 ceph-mon[76053]: Updating compute-0:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec 01 09:50:05 compute-2 ceph-mon[76053]: 5.19 scrub starts
Dec 01 09:50:05 compute-2 ceph-mon[76053]: 5.19 scrub ok
Dec 01 09:50:05 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec 01 09:50:05 compute-2 ceph-mon[76053]: from='client.14358 -' entity='client.admin' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "http://192.168.122.100:3100", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:50:05 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec 01 09:50:05 compute-2 ceph-mon[76053]: Updating compute-2:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec 01 09:50:05 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec 01 09:50:05 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec 01 09:50:05 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec 01 09:50:05 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:50:05 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec 01 09:50:05 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec 01 09:50:05 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec 01 09:50:05 compute-2 ceph-osd[78644]: osd.2 39 tick checking mon for new map
Dec 01 09:50:06 compute-2 sshd-session[79096]: Connection closed by 192.168.122.100 port 50904
Dec 01 09:50:06 compute-2 sshd-session[79093]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 01 09:50:06 compute-2 systemd[1]: session-33.scope: Deactivated successfully.
Dec 01 09:50:06 compute-2 systemd[1]: session-33.scope: Consumed 5.606s CPU time.
Dec 01 09:50:06 compute-2 systemd-logind[795]: Session 33 logged out. Waiting for processes to exit.
Dec 01 09:50:06 compute-2 systemd-logind[795]: Removed session 33.
Dec 01 09:50:06 compute-2 ceph-mgr[76365]: mgr handle_mgr_map respawning because set of enabled modules changed!
Dec 01 09:50:06 compute-2 ceph-mgr[76365]: mgr respawn  e: '/usr/bin/ceph-mgr'
Dec 01 09:50:06 compute-2 ceph-mgr[76365]: mgr respawn  0: '/usr/bin/ceph-mgr'
Dec 01 09:50:06 compute-2 ceph-mgr[76365]: mgr respawn  1: '-n'
Dec 01 09:50:06 compute-2 ceph-mgr[76365]: mgr respawn  2: 'mgr.compute-2.kdtkls'
Dec 01 09:50:06 compute-2 ceph-mgr[76365]: mgr respawn  3: '-f'
Dec 01 09:50:06 compute-2 ceph-mgr[76365]: mgr respawn  4: '--setuser'
Dec 01 09:50:06 compute-2 ceph-mgr[76365]: mgr respawn  5: 'ceph'
Dec 01 09:50:06 compute-2 ceph-mgr[76365]: mgr respawn  6: '--setgroup'
Dec 01 09:50:06 compute-2 ceph-mgr[76365]: mgr respawn  7: 'ceph'
Dec 01 09:50:06 compute-2 ceph-mgr[76365]: mgr respawn  8: '--default-log-to-file=false'
Dec 01 09:50:06 compute-2 ceph-mgr[76365]: mgr respawn  9: '--default-log-to-journald=true'
Dec 01 09:50:06 compute-2 ceph-mgr[76365]: mgr respawn  10: '--default-log-to-stderr=false'
Dec 01 09:50:06 compute-2 ceph-mgr[76365]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Dec 01 09:50:06 compute-2 ceph-mgr[76365]: mgr respawn  exe_path /proc/self/exe
Dec 01 09:50:06 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e40 e40: 3 total, 3 up, 3 in
Dec 01 09:50:06 compute-2 ceph-mon[76053]: pgmap v7: 193 pgs: 164 active+clean, 29 unknown; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Dec 01 09:50:06 compute-2 ceph-mon[76053]: 7.15 deep-scrub starts
Dec 01 09:50:06 compute-2 ceph-mon[76053]: 7.15 deep-scrub ok
Dec 01 09:50:06 compute-2 ceph-mon[76053]: 6.18 deep-scrub starts
Dec 01 09:50:06 compute-2 ceph-mon[76053]: 6.18 deep-scrub ok
Dec 01 09:50:06 compute-2 ceph-mon[76053]: OSD bench result of 4501.924530 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec 01 09:50:06 compute-2 ceph-mon[76053]: Deploying daemon node-exporter.compute-0 on compute-0
Dec 01 09:50:06 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2440048888' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Dec 01 09:50:06 compute-2 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:50:06 compute-2 ceph-osd[78644]: osd.2 40 state: booting -> active
Dec 01 09:50:06 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[6.1b( empty local-lis/les=0/0 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:06 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[4.19( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40) [2] r=0 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:06 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[3.1b( empty local-lis/les=0/0 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[29,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:06 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[4.1c( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40) [2] r=0 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:06 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[4.1d( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40) [2] r=0 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:06 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[2.1b( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:06 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[4.3( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40) [2] r=0 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:06 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[6.1( empty local-lis/les=0/0 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:06 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[3.8( empty local-lis/les=0/0 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[29,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:06 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[4.2( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40) [2] r=0 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:06 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[4.6( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40) [2] r=0 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:06 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[2.a( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:06 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[5.d( empty local-lis/les=0/0 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:06 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[5.0( empty local-lis/les=0/0 n=0 ec=19/19 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:06 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[3.0( empty local-lis/les=0/0 n=0 ec=16/16 lis/c=29/29 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[29,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:06 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[2.c( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:06 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[5.b( empty local-lis/les=0/0 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:06 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[7.a( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:06 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[5.8( empty local-lis/les=0/0 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:06 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[7.14( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:06 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[2.d( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:06 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[2.10( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:06 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[2.13( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:06 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[4.14( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40) [2] r=0 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:06 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[5.13( empty local-lis/les=0/0 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:06 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[2.15( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:06 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[5.12( empty local-lis/les=0/0 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:06 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[7.1d( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: ignoring --setuser ceph since I am not root
Dec 01 09:50:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: ignoring --setgroup ceph since I am not root
Dec 01 09:50:06 compute-2 ceph-mgr[76365]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Dec 01 09:50:06 compute-2 ceph-mgr[76365]: pidfile_write: ignore empty --pid-file
Dec 01 09:50:06 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'alerts'
Dec 01 09:50:06 compute-2 ceph-mgr[76365]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 01 09:50:06 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'balancer'
Dec 01 09:50:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:06.927+0000 7f4857fa3140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 01 09:50:07 compute-2 ceph-mgr[76365]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 01 09:50:07 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'cephadm'
Dec 01 09:50:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:07.015+0000 7f4857fa3140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 01 09:50:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[6.1e( empty local-lis/les=0/0 n=0 ec=32/21 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[7.1f( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[6.1c( empty local-lis/les=0/0 n=0 ec=32/21 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[4.1f( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[6.12( empty local-lis/les=0/0 n=0 ec=32/21 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[3.15( empty local-lis/les=0/0 n=0 ec=29/16 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[7.11( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[6.17( empty local-lis/les=0/0 n=0 ec=32/21 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[7.16( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[4.15( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[2.12( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[3.11( empty local-lis/les=0/0 n=0 ec=29/16 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[2.f( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[2.18( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[4.9( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[3.e( empty local-lis/les=0/0 n=0 ec=29/16 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[2.b( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[5.4( empty local-lis/les=0/0 n=0 ec=32/19 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[2.5( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[7.5( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[4.8( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[4.1( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[3.9( empty local-lis/les=0/0 n=0 ec=29/16 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[5.e( empty local-lis/les=0/0 n=0 ec=32/19 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[3.1a( empty local-lis/les=0/0 n=0 ec=29/16 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[3.1d( empty local-lis/les=0/0 n=0 ec=29/16 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[2.1c( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[5.1a( empty local-lis/les=0/0 n=0 ec=32/19 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[2.1d( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:50:07 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'crash'
Dec 01 09:50:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e40 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:50:07 compute-2 ceph-mon[76053]: 7.0 deep-scrub starts
Dec 01 09:50:07 compute-2 ceph-mon[76053]: 7.0 deep-scrub ok
Dec 01 09:50:07 compute-2 ceph-mon[76053]: 6.1f deep-scrub starts
Dec 01 09:50:07 compute-2 ceph-mon[76053]: 6.1f deep-scrub ok
Dec 01 09:50:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2440048888' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Dec 01 09:50:07 compute-2 ceph-mon[76053]: mgrmap e18: compute-0.fospow(active, since 8s), standbys: compute-1.ymizfm, compute-2.kdtkls
Dec 01 09:50:07 compute-2 ceph-mon[76053]: 7.1 scrub starts
Dec 01 09:50:07 compute-2 ceph-mon[76053]: 7.1 scrub ok
Dec 01 09:50:07 compute-2 ceph-mon[76053]: osd.2 [v2:192.168.122.102:6800/1185161015,v1:192.168.122.102:6801/1185161015] boot
Dec 01 09:50:07 compute-2 ceph-mon[76053]: osdmap e40: 3 total, 3 up, 3 in
Dec 01 09:50:07 compute-2 ceph-mon[76053]: 5.1d scrub starts
Dec 01 09:50:07 compute-2 ceph-mon[76053]: 5.1d scrub ok
Dec 01 09:50:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/521759544' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Dec 01 09:50:07 compute-2 ceph-mgr[76365]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 01 09:50:07 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'dashboard'
Dec 01 09:50:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:07.938+0000 7f4857fa3140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 01 09:50:08 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e41 e41: 3 total, 3 up, 3 in
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[2.15( empty local-lis/les=40/41 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[7.1d( empty local-lis/les=40/41 n=0 ec=33/22 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[4.14( empty local-lis/les=40/41 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40) [2] r=0 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[5.13( empty local-lis/les=40/41 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[2.13( empty local-lis/les=40/41 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[5.12( empty local-lis/les=40/41 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[2.d( empty local-lis/les=40/41 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[7.14( empty local-lis/les=40/41 n=0 ec=33/22 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[5.8( empty local-lis/les=40/41 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[7.a( empty local-lis/les=40/41 n=0 ec=33/22 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[2.c( empty local-lis/les=40/41 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[4.19( empty local-lis/les=40/41 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40) [2] r=0 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[2.10( empty local-lis/les=40/41 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[3.0( empty local-lis/les=40/41 n=0 ec=16/16 lis/c=29/29 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[29,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[3.1b( empty local-lis/les=40/41 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[29,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[5.0( empty local-lis/les=40/41 n=0 ec=19/19 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[4.1c( empty local-lis/les=40/41 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40) [2] r=0 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[3.8( empty local-lis/les=40/41 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[29,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[4.1d( empty local-lis/les=40/41 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40) [2] r=0 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[2.1b( empty local-lis/les=40/41 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[4.2( empty local-lis/les=40/41 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40) [2] r=0 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[6.1b( empty local-lis/les=40/41 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[6.1( empty local-lis/les=40/41 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[4.3( empty local-lis/les=40/41 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40) [2] r=0 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[4.6( empty local-lis/les=40/41 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40) [2] r=0 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[5.d( empty local-lis/les=40/41 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[2.1d( empty local-lis/les=40/41 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[2.1c( empty local-lis/les=40/41 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[3.9( empty local-lis/les=40/41 n=0 ec=29/16 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[2.b( empty local-lis/les=40/41 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[5.1a( empty local-lis/les=40/41 n=0 ec=32/19 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[3.1d( empty local-lis/les=40/41 n=0 ec=29/16 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[5.b( empty local-lis/les=40/41 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[5.4( empty local-lis/les=40/41 n=0 ec=32/19 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[4.8( empty local-lis/les=40/41 n=0 ec=30/18 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[6.12( empty local-lis/les=40/41 n=0 ec=32/21 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[4.1f( empty local-lis/les=40/41 n=0 ec=30/18 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[2.a( empty local-lis/les=40/41 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[7.5( empty local-lis/les=40/41 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[2.5( empty local-lis/les=40/41 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[5.e( empty local-lis/les=40/41 n=0 ec=32/19 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[3.1a( empty local-lis/les=40/41 n=0 ec=29/16 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[6.1e( empty local-lis/les=40/41 n=0 ec=32/21 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[7.1f( empty local-lis/les=40/41 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[4.1( empty local-lis/les=40/41 n=0 ec=30/18 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[4.9( empty local-lis/les=40/41 n=0 ec=30/18 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[3.11( empty local-lis/les=40/41 n=0 ec=29/16 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[2.f( empty local-lis/les=40/41 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[4.15( empty local-lis/les=40/41 n=0 ec=30/18 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[7.16( empty local-lis/les=40/41 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[3.e( empty local-lis/les=40/41 n=0 ec=29/16 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[2.12( empty local-lis/les=40/41 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[3.15( empty local-lis/les=40/41 n=0 ec=29/16 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[7.11( empty local-lis/les=40/41 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[6.1c( empty local-lis/les=40/41 n=0 ec=32/21 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[6.17( empty local-lis/les=40/41 n=0 ec=32/21 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[2.18( empty local-lis/les=40/41 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:50:08 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 7.1d scrub starts
Dec 01 09:50:08 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 7.1d scrub ok
Dec 01 09:50:08 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'devicehealth'
Dec 01 09:50:08 compute-2 ceph-mgr[76365]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 01 09:50:08 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'diskprediction_local'
Dec 01 09:50:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:08.712+0000 7f4857fa3140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 01 09:50:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec 01 09:50:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec 01 09:50:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]:   from numpy import show_config as show_numpy_config
Dec 01 09:50:08 compute-2 ceph-mgr[76365]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 01 09:50:08 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'influx'
Dec 01 09:50:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:08.919+0000 7f4857fa3140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 01 09:50:08 compute-2 ceph-mon[76053]: 7.7 scrub starts
Dec 01 09:50:08 compute-2 ceph-mon[76053]: 7.7 scrub ok
Dec 01 09:50:08 compute-2 ceph-mon[76053]: 6.c scrub starts
Dec 01 09:50:08 compute-2 ceph-mon[76053]: 6.c scrub ok
Dec 01 09:50:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/521759544' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Dec 01 09:50:08 compute-2 ceph-mon[76053]: mgrmap e19: compute-0.fospow(active, since 9s), standbys: compute-1.ymizfm, compute-2.kdtkls
Dec 01 09:50:08 compute-2 ceph-mon[76053]: osdmap e41: 3 total, 3 up, 3 in
Dec 01 09:50:08 compute-2 ceph-mon[76053]: 7.1d scrub starts
Dec 01 09:50:08 compute-2 ceph-mon[76053]: 7.1d scrub ok
Dec 01 09:50:09 compute-2 ceph-mgr[76365]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 01 09:50:09 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'insights'
Dec 01 09:50:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:08.999+0000 7f4857fa3140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 01 09:50:09 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'iostat'
Dec 01 09:50:09 compute-2 ceph-mgr[76365]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 01 09:50:09 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'k8sevents'
Dec 01 09:50:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:09.164+0000 7f4857fa3140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 01 09:50:09 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.12 scrub starts
Dec 01 09:50:09 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.12 scrub ok
Dec 01 09:50:09 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'localpool'
Dec 01 09:50:09 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'mds_autoscaler'
Dec 01 09:50:09 compute-2 ceph-mon[76053]: 5.5 scrub starts
Dec 01 09:50:09 compute-2 ceph-mon[76053]: 5.5 scrub ok
Dec 01 09:50:09 compute-2 ceph-mon[76053]: 7.d scrub starts
Dec 01 09:50:09 compute-2 ceph-mon[76053]: 7.d scrub ok
Dec 01 09:50:09 compute-2 ceph-mon[76053]: 5.12 scrub starts
Dec 01 09:50:09 compute-2 ceph-mon[76053]: 5.12 scrub ok
Dec 01 09:50:09 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'mirroring'
Dec 01 09:50:10 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'nfs'
Dec 01 09:50:10 compute-2 ceph-mgr[76365]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 01 09:50:10 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'orchestrator'
Dec 01 09:50:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:10.402+0000 7f4857fa3140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 01 09:50:10 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.13 scrub starts
Dec 01 09:50:10 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.13 scrub ok
Dec 01 09:50:10 compute-2 ceph-mgr[76365]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 01 09:50:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:10.670+0000 7f4857fa3140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 01 09:50:10 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'osd_perf_query'
Dec 01 09:50:10 compute-2 ceph-mgr[76365]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 01 09:50:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:10.772+0000 7f4857fa3140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 01 09:50:10 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'osd_support'
Dec 01 09:50:10 compute-2 ceph-mgr[76365]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 01 09:50:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:10.859+0000 7f4857fa3140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 01 09:50:10 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'pg_autoscaler'
Dec 01 09:50:10 compute-2 ceph-mgr[76365]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 01 09:50:10 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'progress'
Dec 01 09:50:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:10.955+0000 7f4857fa3140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 01 09:50:10 compute-2 ceph-mon[76053]: 7.c scrub starts
Dec 01 09:50:10 compute-2 ceph-mon[76053]: 7.c scrub ok
Dec 01 09:50:10 compute-2 ceph-mon[76053]: 6.6 deep-scrub starts
Dec 01 09:50:10 compute-2 ceph-mon[76053]: 6.6 deep-scrub ok
Dec 01 09:50:10 compute-2 ceph-mon[76053]: 5.13 scrub starts
Dec 01 09:50:10 compute-2 ceph-mon[76053]: 5.13 scrub ok
Dec 01 09:50:11 compute-2 ceph-mgr[76365]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 01 09:50:11 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'prometheus'
Dec 01 09:50:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:11.047+0000 7f4857fa3140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 01 09:50:11 compute-2 ceph-mgr[76365]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 01 09:50:11 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'rbd_support'
Dec 01 09:50:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:11.487+0000 7f4857fa3140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 01 09:50:11 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 7.14 scrub starts
Dec 01 09:50:11 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 7.14 scrub ok
Dec 01 09:50:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:11.613+0000 7f4857fa3140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 01 09:50:11 compute-2 ceph-mgr[76365]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 01 09:50:11 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'restful'
Dec 01 09:50:11 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'rgw'
Dec 01 09:50:11 compute-2 ceph-mon[76053]: 7.19 scrub starts
Dec 01 09:50:11 compute-2 ceph-mon[76053]: 7.19 scrub ok
Dec 01 09:50:11 compute-2 ceph-mon[76053]: 6.4 scrub starts
Dec 01 09:50:11 compute-2 ceph-mon[76053]: 6.4 scrub ok
Dec 01 09:50:12 compute-2 ceph-mgr[76365]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 01 09:50:12 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'rook'
Dec 01 09:50:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:12.126+0000 7f4857fa3140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 01 09:50:12 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.8 scrub starts
Dec 01 09:50:12 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.8 scrub ok
Dec 01 09:50:12 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e42 e42: 3 total, 3 up, 3 in
Dec 01 09:50:12 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:50:12 compute-2 ceph-mgr[76365]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 01 09:50:12 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'selftest'
Dec 01 09:50:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:12.851+0000 7f4857fa3140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 01 09:50:12 compute-2 ceph-mgr[76365]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 01 09:50:12 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'snap_schedule'
Dec 01 09:50:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:12.935+0000 7f4857fa3140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 01 09:50:12 compute-2 ceph-mon[76053]: 7.14 scrub starts
Dec 01 09:50:12 compute-2 ceph-mon[76053]: 7.14 scrub ok
Dec 01 09:50:12 compute-2 ceph-mon[76053]: 7.1a deep-scrub starts
Dec 01 09:50:12 compute-2 ceph-mon[76053]: 7.1a deep-scrub ok
Dec 01 09:50:12 compute-2 ceph-mon[76053]: 6.0 scrub starts
Dec 01 09:50:12 compute-2 ceph-mon[76053]: 6.0 scrub ok
Dec 01 09:50:12 compute-2 ceph-mon[76053]: 5.8 scrub starts
Dec 01 09:50:12 compute-2 ceph-mon[76053]: 5.8 scrub ok
Dec 01 09:50:12 compute-2 ceph-mon[76053]: Active manager daemon compute-0.fospow restarted
Dec 01 09:50:12 compute-2 ceph-mon[76053]: Activating manager daemon compute-0.fospow
Dec 01 09:50:12 compute-2 ceph-mon[76053]: osdmap e42: 3 total, 3 up, 3 in
Dec 01 09:50:12 compute-2 ceph-mon[76053]: mgrmap e20: compute-0.fospow(active, starting, since 0.0343962s), standbys: compute-1.ymizfm, compute-2.kdtkls
Dec 01 09:50:13 compute-2 ceph-mgr[76365]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec 01 09:50:13 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'stats'
Dec 01 09:50:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:13.035+0000 7f4857fa3140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec 01 09:50:13 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'status'
Dec 01 09:50:13 compute-2 ceph-mgr[76365]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec 01 09:50:13 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'telegraf'
Dec 01 09:50:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:13.225+0000 7f4857fa3140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec 01 09:50:13 compute-2 ceph-mgr[76365]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 01 09:50:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:13.313+0000 7f4857fa3140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 01 09:50:13 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'telemetry'
Dec 01 09:50:13 compute-2 ceph-mgr[76365]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 01 09:50:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:13.508+0000 7f4857fa3140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 01 09:50:13 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'test_orchestrator'
Dec 01 09:50:13 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 7.a scrub starts
Dec 01 09:50:13 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 7.a scrub ok
Dec 01 09:50:13 compute-2 ceph-mgr[76365]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 01 09:50:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:13.774+0000 7f4857fa3140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 01 09:50:13 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'volumes'
Dec 01 09:50:13 compute-2 ceph-mon[76053]: 5.10 scrub starts
Dec 01 09:50:13 compute-2 ceph-mon[76053]: 5.10 scrub ok
Dec 01 09:50:13 compute-2 ceph-mon[76053]: 5.3 scrub starts
Dec 01 09:50:13 compute-2 ceph-mon[76053]: 5.3 scrub ok
Dec 01 09:50:13 compute-2 ceph-mon[76053]: 7.a scrub starts
Dec 01 09:50:13 compute-2 ceph-mon[76053]: 7.a scrub ok
Dec 01 09:50:13 compute-2 ceph-mon[76053]: Standby manager daemon compute-1.ymizfm restarted
Dec 01 09:50:13 compute-2 ceph-mon[76053]: Standby manager daemon compute-1.ymizfm started
Dec 01 09:50:14 compute-2 ceph-mgr[76365]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 01 09:50:14 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'zabbix'
Dec 01 09:50:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:14.089+0000 7f4857fa3140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 01 09:50:14 compute-2 ceph-mgr[76365]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 01 09:50:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:14.206+0000 7f4857fa3140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 01 09:50:14 compute-2 ceph-mgr[76365]: ms_deliver_dispatch: unhandled message 0x55fbae6d3860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Dec 01 09:50:14 compute-2 ceph-mgr[76365]: mgr handle_mgr_map respawning because set of enabled modules changed!
Dec 01 09:50:14 compute-2 ceph-mgr[76365]: mgr respawn  e: '/usr/bin/ceph-mgr'
Dec 01 09:50:14 compute-2 ceph-mgr[76365]: mgr respawn  0: '/usr/bin/ceph-mgr'
Dec 01 09:50:14 compute-2 ceph-mgr[76365]: mgr respawn  1: '-n'
Dec 01 09:50:14 compute-2 ceph-mgr[76365]: mgr respawn  2: 'mgr.compute-2.kdtkls'
Dec 01 09:50:14 compute-2 ceph-mgr[76365]: mgr respawn  3: '-f'
Dec 01 09:50:14 compute-2 ceph-mgr[76365]: mgr respawn  4: '--setuser'
Dec 01 09:50:14 compute-2 ceph-mgr[76365]: mgr respawn  5: 'ceph'
Dec 01 09:50:14 compute-2 ceph-mgr[76365]: mgr respawn  6: '--setgroup'
Dec 01 09:50:14 compute-2 ceph-mgr[76365]: mgr respawn  7: 'ceph'
Dec 01 09:50:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: ignoring --setuser ceph since I am not root
Dec 01 09:50:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: ignoring --setgroup ceph since I am not root
Dec 01 09:50:14 compute-2 ceph-mgr[76365]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Dec 01 09:50:14 compute-2 ceph-mgr[76365]: pidfile_write: ignore empty --pid-file
Dec 01 09:50:14 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'alerts'
Dec 01 09:50:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:14.498+0000 7fce93a4d140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 01 09:50:14 compute-2 ceph-mgr[76365]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 01 09:50:14 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'balancer'
Dec 01 09:50:14 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 2.10 scrub starts
Dec 01 09:50:14 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 2.10 scrub ok
Dec 01 09:50:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:14.613+0000 7fce93a4d140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 01 09:50:14 compute-2 ceph-mgr[76365]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 01 09:50:14 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'cephadm'
Dec 01 09:50:15 compute-2 ceph-mon[76053]: 5.15 scrub starts
Dec 01 09:50:15 compute-2 ceph-mon[76053]: 5.15 scrub ok
Dec 01 09:50:15 compute-2 ceph-mon[76053]: 5.6 scrub starts
Dec 01 09:50:15 compute-2 ceph-mon[76053]: 5.6 scrub ok
Dec 01 09:50:15 compute-2 ceph-mon[76053]: mgrmap e21: compute-0.fospow(active, starting, since 1.23559s), standbys: compute-1.ymizfm, compute-2.kdtkls
Dec 01 09:50:15 compute-2 ceph-mon[76053]: Standby manager daemon compute-2.kdtkls restarted
Dec 01 09:50:15 compute-2 ceph-mon[76053]: Standby manager daemon compute-2.kdtkls started
Dec 01 09:50:15 compute-2 ceph-mon[76053]: 2.10 scrub starts
Dec 01 09:50:15 compute-2 ceph-mon[76053]: 2.10 scrub ok
Dec 01 09:50:15 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 2.c deep-scrub starts
Dec 01 09:50:15 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'crash'
Dec 01 09:50:15 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 2.c deep-scrub ok
Dec 01 09:50:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:15.637+0000 7fce93a4d140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 01 09:50:15 compute-2 ceph-mgr[76365]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 01 09:50:15 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'dashboard'
Dec 01 09:50:16 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'devicehealth'
Dec 01 09:50:16 compute-2 ceph-mon[76053]: 5.16 deep-scrub starts
Dec 01 09:50:16 compute-2 ceph-mon[76053]: 5.16 deep-scrub ok
Dec 01 09:50:16 compute-2 ceph-mon[76053]: 5.c scrub starts
Dec 01 09:50:16 compute-2 ceph-mon[76053]: 5.c scrub ok
Dec 01 09:50:16 compute-2 ceph-mon[76053]: mgrmap e22: compute-0.fospow(active, starting, since 2s), standbys: compute-1.ymizfm, compute-2.kdtkls
Dec 01 09:50:16 compute-2 ceph-mon[76053]: 2.c deep-scrub starts
Dec 01 09:50:16 compute-2 ceph-mon[76053]: 2.c deep-scrub ok
Dec 01 09:50:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:16.411+0000 7fce93a4d140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 01 09:50:16 compute-2 ceph-mgr[76365]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 01 09:50:16 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'diskprediction_local'
Dec 01 09:50:16 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.0 scrub starts
Dec 01 09:50:16 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.0 scrub ok
Dec 01 09:50:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec 01 09:50:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec 01 09:50:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]:   from numpy import show_config as show_numpy_config
Dec 01 09:50:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:16.621+0000 7fce93a4d140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 01 09:50:16 compute-2 ceph-mgr[76365]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 01 09:50:16 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'influx'
Dec 01 09:50:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:16.702+0000 7fce93a4d140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 01 09:50:16 compute-2 ceph-mgr[76365]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 01 09:50:16 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'insights'
Dec 01 09:50:16 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'iostat'
Dec 01 09:50:16 compute-2 systemd[1]: Stopping User Manager for UID 42477...
Dec 01 09:50:16 compute-2 systemd[72747]: Activating special unit Exit the Session...
Dec 01 09:50:16 compute-2 systemd[72747]: Stopped target Main User Target.
Dec 01 09:50:16 compute-2 systemd[72747]: Stopped target Basic System.
Dec 01 09:50:16 compute-2 systemd[72747]: Stopped target Paths.
Dec 01 09:50:16 compute-2 systemd[72747]: Stopped target Sockets.
Dec 01 09:50:16 compute-2 systemd[72747]: Stopped target Timers.
Dec 01 09:50:16 compute-2 systemd[72747]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec 01 09:50:16 compute-2 systemd[72747]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 01 09:50:16 compute-2 systemd[72747]: Closed D-Bus User Message Bus Socket.
Dec 01 09:50:16 compute-2 systemd[72747]: Stopped Create User's Volatile Files and Directories.
Dec 01 09:50:16 compute-2 systemd[72747]: Removed slice User Application Slice.
Dec 01 09:50:16 compute-2 systemd[72747]: Reached target Shutdown.
Dec 01 09:50:16 compute-2 systemd[72747]: Finished Exit the Session.
Dec 01 09:50:16 compute-2 systemd[72747]: Reached target Exit the Session.
Dec 01 09:50:16 compute-2 systemd[1]: user@42477.service: Deactivated successfully.
Dec 01 09:50:16 compute-2 systemd[1]: Stopped User Manager for UID 42477.
Dec 01 09:50:16 compute-2 systemd[1]: Stopping User Runtime Directory /run/user/42477...
Dec 01 09:50:16 compute-2 systemd[1]: run-user-42477.mount: Deactivated successfully.
Dec 01 09:50:16 compute-2 systemd[1]: user-runtime-dir@42477.service: Deactivated successfully.
Dec 01 09:50:16 compute-2 systemd[1]: Stopped User Runtime Directory /run/user/42477.
Dec 01 09:50:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:16.865+0000 7fce93a4d140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 01 09:50:16 compute-2 ceph-mgr[76365]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 01 09:50:16 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'k8sevents'
Dec 01 09:50:16 compute-2 systemd[1]: Removed slice User Slice of UID 42477.
Dec 01 09:50:16 compute-2 systemd[1]: user-42477.slice: Consumed 1min 29.431s CPU time.
Dec 01 09:50:17 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'localpool'
Dec 01 09:50:17 compute-2 ceph-mon[76053]: 6.15 scrub starts
Dec 01 09:50:17 compute-2 ceph-mon[76053]: 6.15 scrub ok
Dec 01 09:50:17 compute-2 ceph-mon[76053]: 6.f scrub starts
Dec 01 09:50:17 compute-2 ceph-mon[76053]: 6.f scrub ok
Dec 01 09:50:17 compute-2 ceph-mon[76053]: 5.0 scrub starts
Dec 01 09:50:17 compute-2 ceph-mon[76053]: 5.0 scrub ok
Dec 01 09:50:17 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'mds_autoscaler'
Dec 01 09:50:17 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 6.1b deep-scrub starts
Dec 01 09:50:17 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 6.1b deep-scrub ok
Dec 01 09:50:17 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'mirroring'
Dec 01 09:50:17 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'nfs'
Dec 01 09:50:17 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:50:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:18.002+0000 7fce93a4d140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 01 09:50:18 compute-2 ceph-mgr[76365]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 01 09:50:18 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'orchestrator'
Dec 01 09:50:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:18.251+0000 7fce93a4d140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 01 09:50:18 compute-2 ceph-mgr[76365]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 01 09:50:18 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'osd_perf_query'
Dec 01 09:50:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:18.342+0000 7fce93a4d140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 01 09:50:18 compute-2 ceph-mgr[76365]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 01 09:50:18 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'osd_support'
Dec 01 09:50:18 compute-2 ceph-mon[76053]: 5.11 scrub starts
Dec 01 09:50:18 compute-2 ceph-mon[76053]: 5.a scrub starts
Dec 01 09:50:18 compute-2 ceph-mon[76053]: 5.11 scrub ok
Dec 01 09:50:18 compute-2 ceph-mon[76053]: 5.a scrub ok
Dec 01 09:50:18 compute-2 ceph-mon[76053]: 6.1b deep-scrub starts
Dec 01 09:50:18 compute-2 ceph-mon[76053]: 6.1b deep-scrub ok
Dec 01 09:50:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:18.429+0000 7fce93a4d140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 01 09:50:18 compute-2 ceph-mgr[76365]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 01 09:50:18 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'pg_autoscaler'
Dec 01 09:50:18 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 6.1 scrub starts
Dec 01 09:50:18 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 6.1 scrub ok
Dec 01 09:50:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:18.527+0000 7fce93a4d140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 01 09:50:18 compute-2 ceph-mgr[76365]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 01 09:50:18 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'progress'
Dec 01 09:50:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:18.613+0000 7fce93a4d140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 01 09:50:18 compute-2 ceph-mgr[76365]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 01 09:50:18 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'prometheus'
Dec 01 09:50:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:19.009+0000 7fce93a4d140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 01 09:50:19 compute-2 ceph-mgr[76365]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 01 09:50:19 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'rbd_support'
Dec 01 09:50:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:19.116+0000 7fce93a4d140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 01 09:50:19 compute-2 ceph-mgr[76365]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 01 09:50:19 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'restful'
Dec 01 09:50:19 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'rgw'
Dec 01 09:50:19 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.d scrub starts
Dec 01 09:50:19 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.d scrub ok
Dec 01 09:50:19 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e43 e43: 3 total, 3 up, 3 in
Dec 01 09:50:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:19.586+0000 7fce93a4d140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 01 09:50:19 compute-2 ceph-mgr[76365]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 01 09:50:19 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'rook'
Dec 01 09:50:19 compute-2 ceph-mon[76053]: 6.9 scrub starts
Dec 01 09:50:19 compute-2 ceph-mon[76053]: 6.9 scrub ok
Dec 01 09:50:19 compute-2 ceph-mon[76053]: 6.a scrub starts
Dec 01 09:50:19 compute-2 ceph-mon[76053]: 6.a scrub ok
Dec 01 09:50:19 compute-2 ceph-mon[76053]: 6.1 scrub starts
Dec 01 09:50:19 compute-2 ceph-mon[76053]: 6.1 scrub ok
Dec 01 09:50:19 compute-2 ceph-mon[76053]: Active manager daemon compute-0.fospow restarted
Dec 01 09:50:19 compute-2 ceph-mon[76053]: Activating manager daemon compute-0.fospow
Dec 01 09:50:20 compute-2 sshd-session[80427]: Accepted publickey for ceph-admin from 192.168.122.100 port 53718 ssh2: RSA SHA256:Z7lOpv0ev+jSExhAdyqEmkuT2VPaBkXv4p0gmi28XsM
Dec 01 09:50:20 compute-2 systemd[1]: Created slice User Slice of UID 42477.
Dec 01 09:50:20 compute-2 systemd[1]: Starting User Runtime Directory /run/user/42477...
Dec 01 09:50:20 compute-2 systemd-logind[795]: New session 34 of user ceph-admin.
Dec 01 09:50:20 compute-2 systemd[1]: Finished User Runtime Directory /run/user/42477.
Dec 01 09:50:20 compute-2 systemd[1]: Starting User Manager for UID 42477...
Dec 01 09:50:20 compute-2 systemd[80431]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:50:20 compute-2 systemd[80431]: Queued start job for default target Main User Target.
Dec 01 09:50:20 compute-2 systemd[80431]: Created slice User Application Slice.
Dec 01 09:50:20 compute-2 systemd[80431]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 01 09:50:20 compute-2 systemd[80431]: Started Daily Cleanup of User's Temporary Directories.
Dec 01 09:50:20 compute-2 systemd[80431]: Reached target Paths.
Dec 01 09:50:20 compute-2 systemd[80431]: Reached target Timers.
Dec 01 09:50:20 compute-2 systemd[80431]: Starting D-Bus User Message Bus Socket...
Dec 01 09:50:20 compute-2 systemd[80431]: Starting Create User's Volatile Files and Directories...
Dec 01 09:50:20 compute-2 systemd[80431]: Finished Create User's Volatile Files and Directories.
Dec 01 09:50:20 compute-2 systemd[80431]: Listening on D-Bus User Message Bus Socket.
Dec 01 09:50:20 compute-2 systemd[80431]: Reached target Sockets.
Dec 01 09:50:20 compute-2 systemd[80431]: Reached target Basic System.
Dec 01 09:50:20 compute-2 systemd[80431]: Reached target Main User Target.
Dec 01 09:50:20 compute-2 systemd[80431]: Startup finished in 126ms.
Dec 01 09:50:20 compute-2 systemd[1]: Started User Manager for UID 42477.
Dec 01 09:50:20 compute-2 systemd[1]: Started Session 34 of User ceph-admin.
Dec 01 09:50:20 compute-2 sshd-session[80427]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:50:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:20.233+0000 7fce93a4d140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 01 09:50:20 compute-2 ceph-mgr[76365]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 01 09:50:20 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'selftest'
Dec 01 09:50:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:20.315+0000 7fce93a4d140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 01 09:50:20 compute-2 ceph-mgr[76365]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 01 09:50:20 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'snap_schedule'
Dec 01 09:50:20 compute-2 sudo[80447]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:50:20 compute-2 sudo[80447]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:20 compute-2 sudo[80447]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:20 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.1a scrub starts
Dec 01 09:50:20 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.1a scrub ok
Dec 01 09:50:20 compute-2 sudo[80472]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Dec 01 09:50:20 compute-2 sudo[80472]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:20.411+0000 7fce93a4d140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec 01 09:50:20 compute-2 ceph-mgr[76365]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec 01 09:50:20 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'stats'
Dec 01 09:50:20 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'status'
Dec 01 09:50:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:20.588+0000 7fce93a4d140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec 01 09:50:20 compute-2 ceph-mgr[76365]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec 01 09:50:20 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'telegraf'
Dec 01 09:50:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:20.673+0000 7fce93a4d140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 01 09:50:20 compute-2 ceph-mgr[76365]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 01 09:50:20 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'telemetry'
Dec 01 09:50:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:20.859+0000 7fce93a4d140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 01 09:50:20 compute-2 ceph-mgr[76365]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 01 09:50:20 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'test_orchestrator'
Dec 01 09:50:20 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).mds e2 new map
Dec 01 09:50:20 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).mds e2 print_map
                                           e2
                                           btime 2025-12-01T09:50:20:704588+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        2
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-12-01T09:50:20.704523+0000
                                           modified        2025-12-01T09:50:20.704523+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        
                                           up        {}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 0 members: 
                                            
                                            
Dec 01 09:50:20 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e44 e44: 3 total, 3 up, 3 in
Dec 01 09:50:21 compute-2 ceph-mon[76053]: 6.b scrub starts
Dec 01 09:50:21 compute-2 ceph-mon[76053]: 6.b scrub ok
Dec 01 09:50:21 compute-2 ceph-mon[76053]: 5.9 scrub starts
Dec 01 09:50:21 compute-2 ceph-mon[76053]: 5.9 scrub ok
Dec 01 09:50:21 compute-2 ceph-mon[76053]: 5.d scrub starts
Dec 01 09:50:21 compute-2 ceph-mon[76053]: 5.d scrub ok
Dec 01 09:50:21 compute-2 ceph-mon[76053]: osdmap e43: 3 total, 3 up, 3 in
Dec 01 09:50:21 compute-2 ceph-mon[76053]: mgrmap e23: compute-0.fospow(active, starting, since 0.181684s), standbys: compute-1.ymizfm, compute-2.kdtkls
Dec 01 09:50:21 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Dec 01 09:50:21 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 01 09:50:21 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Dec 01 09:50:21 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mgr metadata", "who": "compute-0.fospow", "id": "compute-0.fospow"}]: dispatch
Dec 01 09:50:21 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mgr metadata", "who": "compute-1.ymizfm", "id": "compute-1.ymizfm"}]: dispatch
Dec 01 09:50:21 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mgr metadata", "who": "compute-2.kdtkls", "id": "compute-2.kdtkls"}]: dispatch
Dec 01 09:50:21 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:50:21 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:50:21 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:50:21 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mds metadata"}]: dispatch
Dec 01 09:50:21 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec 01 09:50:21 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata"}]: dispatch
Dec 01 09:50:21 compute-2 ceph-mon[76053]: Manager daemon compute-0.fospow is now available
Dec 01 09:50:21 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.fospow/mirror_snapshot_schedule"}]: dispatch
Dec 01 09:50:21 compute-2 ceph-mon[76053]: 6.14 scrub starts
Dec 01 09:50:21 compute-2 ceph-mon[76053]: 6.14 scrub ok
Dec 01 09:50:21 compute-2 ceph-mon[76053]: 6.8 deep-scrub starts
Dec 01 09:50:21 compute-2 ceph-mon[76053]: 6.8 deep-scrub ok
Dec 01 09:50:21 compute-2 ceph-mon[76053]: Standby manager daemon compute-1.ymizfm restarted
Dec 01 09:50:21 compute-2 ceph-mon[76053]: Standby manager daemon compute-1.ymizfm started
Dec 01 09:50:21 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.fospow/trash_purge_schedule"}]: dispatch
Dec 01 09:50:21 compute-2 ceph-mon[76053]: mgrmap e24: compute-0.fospow(active, since 1.43483s), standbys: compute-1.ymizfm, compute-2.kdtkls
Dec 01 09:50:21 compute-2 ceph-mon[76053]: from='client.14388 -' entity='client.admin' cmd=[{"prefix": "fs volume create", "name": "cephfs", "placement": "compute-0 compute-1 compute-2 ", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:50:21 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Dec 01 09:50:21 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Dec 01 09:50:21 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Dec 01 09:50:21 compute-2 ceph-mon[76053]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Dec 01 09:50:21 compute-2 ceph-mon[76053]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Dec 01 09:50:21 compute-2 podman[80569]: 2025-12-01 09:50:21.030237566 +0000 UTC m=+0.177310350 container exec 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0)
Dec 01 09:50:21 compute-2 podman[80569]: 2025-12-01 09:50:21.150970351 +0000 UTC m=+0.298043115 container exec_died 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:50:21 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 3.1d scrub starts
Dec 01 09:50:21 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 3.1d scrub ok
Dec 01 09:50:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:21.328+0000 7fce93a4d140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 01 09:50:21 compute-2 ceph-mgr[76365]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 01 09:50:21 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'volumes'
Dec 01 09:50:21 compute-2 sudo[80472]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:21.668+0000 7fce93a4d140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 01 09:50:21 compute-2 ceph-mgr[76365]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 01 09:50:21 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'zabbix'
Dec 01 09:50:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:21.759+0000 7fce93a4d140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 01 09:50:21 compute-2 ceph-mgr[76365]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 01 09:50:21 compute-2 ceph-mgr[76365]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:50:21 compute-2 ceph-mgr[76365]: mgr load Constructed class from module: dashboard
Dec 01 09:50:21 compute-2 ceph-mgr[76365]: ms_deliver_dispatch: unhandled message 0x55b835b7f860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Dec 01 09:50:21 compute-2 ceph-mgr[76365]: [dashboard INFO root] server: ssl=no host=192.168.122.102 port=8443
Dec 01 09:50:21 compute-2 ceph-mgr[76365]: [dashboard INFO root] Configured CherryPy, starting engine...
Dec 01 09:50:21 compute-2 ceph-mgr[76365]: [dashboard INFO root] Starting engine...
Dec 01 09:50:21 compute-2 ceph-mgr[76365]: [dashboard INFO root] Engine started...
Dec 01 09:50:21 compute-2 sudo[80668]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:50:21 compute-2 sudo[80668]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:21 compute-2 sudo[80668]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:22 compute-2 sudo[80693]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 09:50:22 compute-2 sudo[80693]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:22 compute-2 ceph-mon[76053]: 5.1a scrub starts
Dec 01 09:50:22 compute-2 ceph-mon[76053]: 5.1a scrub ok
Dec 01 09:50:22 compute-2 ceph-mon[76053]: 5.17 scrub starts
Dec 01 09:50:22 compute-2 ceph-mon[76053]: 6.7 scrub starts
Dec 01 09:50:22 compute-2 ceph-mon[76053]: 5.17 scrub ok
Dec 01 09:50:22 compute-2 ceph-mon[76053]: 6.7 scrub ok
Dec 01 09:50:22 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Dec 01 09:50:22 compute-2 ceph-mon[76053]: osdmap e44: 3 total, 3 up, 3 in
Dec 01 09:50:22 compute-2 ceph-mon[76053]: fsmap cephfs:0
Dec 01 09:50:22 compute-2 ceph-mon[76053]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Dec 01 09:50:22 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:22 compute-2 ceph-mon[76053]: 3.1d scrub starts
Dec 01 09:50:22 compute-2 ceph-mon[76053]: 3.1d scrub ok
Dec 01 09:50:22 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:22 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:22 compute-2 ceph-mon[76053]: pgmap v5: 193 pgs: 193 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:50:22 compute-2 ceph-mon[76053]: [01/Dec/2025:09:50:21] ENGINE Bus STARTING
Dec 01 09:50:22 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:22 compute-2 ceph-mon[76053]: [01/Dec/2025:09:50:21] ENGINE Serving on https://192.168.122.100:7150
Dec 01 09:50:22 compute-2 ceph-mon[76053]: [01/Dec/2025:09:50:21] ENGINE Client ('192.168.122.100', 56006) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 01 09:50:22 compute-2 ceph-mon[76053]: Standby manager daemon compute-2.kdtkls restarted
Dec 01 09:50:22 compute-2 ceph-mon[76053]: Standby manager daemon compute-2.kdtkls started
Dec 01 09:50:22 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:22 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:22 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:22 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.b scrub starts
Dec 01 09:50:22 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.b scrub ok
Dec 01 09:50:22 compute-2 sudo[80693]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:22 compute-2 sudo[80749]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:50:22 compute-2 sudo[80749]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:22 compute-2 sudo[80749]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:22 compute-2 sudo[80774]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Dec 01 09:50:22 compute-2 sudo[80774]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:22 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e44 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:50:23 compute-2 sudo[80774]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:23 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.4 scrub starts
Dec 01 09:50:23 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.4 scrub ok
Dec 01 09:50:23 compute-2 ceph-mon[76053]: 5.14 scrub starts
Dec 01 09:50:23 compute-2 ceph-mon[76053]: 5.14 scrub ok
Dec 01 09:50:23 compute-2 ceph-mon[76053]: 5.7 scrub starts
Dec 01 09:50:23 compute-2 ceph-mon[76053]: 5.7 scrub ok
Dec 01 09:50:23 compute-2 ceph-mon[76053]: [01/Dec/2025:09:50:21] ENGINE Serving on http://192.168.122.100:8765
Dec 01 09:50:23 compute-2 ceph-mon[76053]: [01/Dec/2025:09:50:21] ENGINE Bus STARTED
Dec 01 09:50:23 compute-2 ceph-mon[76053]: from='client.14424 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:50:23 compute-2 ceph-mon[76053]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Dec 01 09:50:23 compute-2 ceph-mon[76053]: 5.b scrub starts
Dec 01 09:50:23 compute-2 ceph-mon[76053]: 5.b scrub ok
Dec 01 09:50:23 compute-2 ceph-mon[76053]: mgrmap e25: compute-0.fospow(active, since 3s), standbys: compute-1.ymizfm, compute-2.kdtkls
Dec 01 09:50:23 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:23 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:23 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:23 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Dec 01 09:50:23 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:23 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:23 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Dec 01 09:50:23 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:23 compute-2 sudo[80817]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 01 09:50:23 compute-2 sudo[80817]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:23 compute-2 sudo[80817]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:23 compute-2 sudo[80842]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph
Dec 01 09:50:23 compute-2 sudo[80842]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:23 compute-2 sudo[80842]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:23 compute-2 sudo[80867]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.conf.new
Dec 01 09:50:23 compute-2 sudo[80867]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:23 compute-2 sudo[80867]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:23 compute-2 sudo[80892]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:50:23 compute-2 sudo[80892]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:23 compute-2 sudo[80892]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:23 compute-2 sudo[80917]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.conf.new
Dec 01 09:50:23 compute-2 sudo[80917]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:23 compute-2 sudo[80917]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:23 compute-2 sudo[80965]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.conf.new
Dec 01 09:50:23 compute-2 sudo[80965]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:23 compute-2 sudo[80965]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:23 compute-2 sudo[80990]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.conf.new
Dec 01 09:50:23 compute-2 sudo[80990]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:23 compute-2 sudo[80990]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:23 compute-2 sudo[81015]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 01 09:50:23 compute-2 sudo[81015]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:23 compute-2 sudo[81015]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:24 compute-2 sudo[81040]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config
Dec 01 09:50:24 compute-2 sudo[81040]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:24 compute-2 sudo[81040]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:24 compute-2 sudo[81065]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config
Dec 01 09:50:24 compute-2 sudo[81065]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:24 compute-2 sudo[81065]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:24 compute-2 sudo[81090]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf.new
Dec 01 09:50:24 compute-2 sudo[81090]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:24 compute-2 sudo[81090]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:24 compute-2 sudo[81115]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:50:24 compute-2 sudo[81115]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:24 compute-2 sudo[81115]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:24 compute-2 sudo[81140]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf.new
Dec 01 09:50:24 compute-2 sudo[81140]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:24 compute-2 sudo[81140]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:24 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.e scrub starts
Dec 01 09:50:24 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.e scrub ok
Dec 01 09:50:24 compute-2 sudo[81188]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf.new
Dec 01 09:50:24 compute-2 sudo[81188]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:24 compute-2 sudo[81188]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:24 compute-2 sudo[81213]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf.new
Dec 01 09:50:24 compute-2 sudo[81213]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:24 compute-2 sudo[81213]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:24 compute-2 sudo[81238]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf.new /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec 01 09:50:24 compute-2 sudo[81238]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:24 compute-2 sudo[81238]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:24 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e45 e45: 3 total, 3 up, 3 in
Dec 01 09:50:24 compute-2 ceph-mon[76053]: 5.2 scrub starts
Dec 01 09:50:24 compute-2 ceph-mon[76053]: 5.2 scrub ok
Dec 01 09:50:24 compute-2 ceph-mon[76053]: 5.1e scrub starts
Dec 01 09:50:24 compute-2 ceph-mon[76053]: 5.1e scrub ok
Dec 01 09:50:24 compute-2 ceph-mon[76053]: Adjusting osd_memory_target on compute-1 to 128.0M
Dec 01 09:50:24 compute-2 ceph-mon[76053]: Unable to set osd_memory_target on compute-1 to 134220595: error parsing value: Value '134220595' is below minimum 939524096
Dec 01 09:50:24 compute-2 ceph-mon[76053]: Adjusting osd_memory_target on compute-2 to 127.9M
Dec 01 09:50:24 compute-2 ceph-mon[76053]: Unable to set osd_memory_target on compute-2 to 134214860: error parsing value: Value '134214860' is below minimum 939524096
Dec 01 09:50:24 compute-2 ceph-mon[76053]: 5.4 scrub starts
Dec 01 09:50:24 compute-2 ceph-mon[76053]: 5.4 scrub ok
Dec 01 09:50:24 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:24 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec 01 09:50:24 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:50:24 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:50:24 compute-2 ceph-mon[76053]: Updating compute-0:/etc/ceph/ceph.conf
Dec 01 09:50:24 compute-2 ceph-mon[76053]: Updating compute-1:/etc/ceph/ceph.conf
Dec 01 09:50:24 compute-2 ceph-mon[76053]: Updating compute-2:/etc/ceph/ceph.conf
Dec 01 09:50:24 compute-2 ceph-mon[76053]: from='client.14430 -' entity='client.admin' cmd=[{"prefix": "nfs cluster create", "cluster_id": "cephfs", "ingress": true, "virtual_ip": "192.168.122.2/24", "ingress_mode": "haproxy-protocol", "placement": "compute-0 compute-1 compute-2 ", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 09:50:24 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]: dispatch
Dec 01 09:50:24 compute-2 ceph-mon[76053]: pgmap v6: 193 pgs: 193 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:50:24 compute-2 ceph-mon[76053]: Updating compute-1:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec 01 09:50:24 compute-2 ceph-mon[76053]: mgrmap e26: compute-0.fospow(active, since 4s), standbys: compute-1.ymizfm, compute-2.kdtkls
Dec 01 09:50:24 compute-2 sudo[81263]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 01 09:50:24 compute-2 sudo[81263]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:24 compute-2 sudo[81263]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:24 compute-2 sudo[81288]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph
Dec 01 09:50:24 compute-2 sudo[81288]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:24 compute-2 sudo[81288]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:24 compute-2 sudo[81313]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.client.admin.keyring.new
Dec 01 09:50:24 compute-2 sudo[81313]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:24 compute-2 sudo[81313]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:24 compute-2 sudo[81338]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:50:24 compute-2 sudo[81338]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:24 compute-2 sudo[81338]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:24 compute-2 sudo[81363]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.client.admin.keyring.new
Dec 01 09:50:24 compute-2 sudo[81363]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:24 compute-2 sudo[81363]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:24 compute-2 sudo[81411]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.client.admin.keyring.new
Dec 01 09:50:25 compute-2 sudo[81411]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:25 compute-2 sudo[81411]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:25 compute-2 sudo[81436]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.client.admin.keyring.new
Dec 01 09:50:25 compute-2 sudo[81436]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:25 compute-2 sudo[81436]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:25 compute-2 sudo[81461]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 01 09:50:25 compute-2 sudo[81461]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:25 compute-2 sudo[81461]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:25 compute-2 sudo[81486]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config
Dec 01 09:50:25 compute-2 sudo[81486]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:25 compute-2 sudo[81486]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:25 compute-2 sudo[81511]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config
Dec 01 09:50:25 compute-2 sudo[81511]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:25 compute-2 sudo[81511]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:25 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 6.12 scrub starts
Dec 01 09:50:25 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 6.12 scrub ok
Dec 01 09:50:25 compute-2 sudo[81536]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring.new
Dec 01 09:50:25 compute-2 sudo[81536]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:25 compute-2 sudo[81536]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:25 compute-2 sudo[81561]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:50:25 compute-2 sudo[81561]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:25 compute-2 sudo[81561]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:25 compute-2 sudo[81586]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring.new
Dec 01 09:50:25 compute-2 sudo[81586]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:25 compute-2 sudo[81586]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:25 compute-2 sudo[81634]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring.new
Dec 01 09:50:25 compute-2 sudo[81634]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:25 compute-2 sudo[81634]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:25 compute-2 sudo[81659]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring.new
Dec 01 09:50:25 compute-2 sudo[81659]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:25 compute-2 sudo[81659]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:25 compute-2 sudo[81684]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring.new /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec 01 09:50:25 compute-2 sudo[81684]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:25 compute-2 sudo[81684]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:25 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e46 e46: 3 total, 3 up, 3 in
Dec 01 09:50:26 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 7.5 scrub starts
Dec 01 09:50:26 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 7.5 scrub ok
Dec 01 09:50:27 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 7.1f scrub starts
Dec 01 09:50:32 compute-2 ceph-mon[76053]: 5.1 scrub starts
Dec 01 09:50:32 compute-2 ceph-mon[76053]: 5.1 scrub ok
Dec 01 09:50:32 compute-2 ceph-mon[76053]: 7.1b scrub starts
Dec 01 09:50:32 compute-2 ceph-mon[76053]: 7.1b scrub ok
Dec 01 09:50:32 compute-2 ceph-mon[76053]: Updating compute-2:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec 01 09:50:32 compute-2 ceph-mon[76053]: 5.e scrub starts
Dec 01 09:50:32 compute-2 ceph-mon[76053]: 5.e scrub ok
Dec 01 09:50:32 compute-2 ceph-mon[76053]: Updating compute-0:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec 01 09:50:32 compute-2 ceph-mon[76053]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Dec 01 09:50:32 compute-2 ceph-mon[76053]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Dec 01 09:50:32 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]': finished
Dec 01 09:50:32 compute-2 ceph-mon[76053]: osdmap e45: 3 total, 3 up, 3 in
Dec 01 09:50:32 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]: dispatch
Dec 01 09:50:32 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) log_latency slow operation observed for kv_commit, latency = 5.618134022s
Dec 01 09:50:32 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) log_latency slow operation observed for kv_sync, latency = 5.618134499s
Dec 01 09:50:32 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) log_latency_fn slow operation observed for _txc_committed_kv, latency = 5.774113655s, txc = 0x563656a5e000
Dec 01 09:50:32 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 7.1f scrub ok
Dec 01 09:50:32 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e46 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:50:32 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).paxos(paxos updating c 1..460) lease_expire from mon.0 v2:192.168.122.100:3300/0 is 0.615036905s seconds in the past; mons are probably laggy (or possibly clocks are too skewed)
Dec 01 09:50:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2[76049]: 2025-12-01T09:50:32.028+0000 7fdd8df85640 -1 mon.compute-2@1(peon).paxos(paxos updating c 1..460) lease_expire from mon.0 v2:192.168.122.100:3300/0 is 0.615036905s seconds in the past; mons are probably laggy (or possibly clocks are too skewed)
Dec 01 09:50:32 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 6.1e scrub starts
Dec 01 09:50:32 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) log_latency_fn slow operation observed for _txc_committed_kv, latency = 5.635492325s, txc = 0x5636565a4f00
Dec 01 09:50:32 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 6.1e scrub ok
Dec 01 09:50:32 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e47 e47: 3 total, 3 up, 3 in
Dec 01 09:50:32 compute-2 ceph-mon[76053]: 7.18 scrub starts
Dec 01 09:50:32 compute-2 ceph-mon[76053]: 7.18 scrub ok
Dec 01 09:50:32 compute-2 ceph-mon[76053]: 5.1f deep-scrub starts
Dec 01 09:50:32 compute-2 ceph-mon[76053]: 5.1f deep-scrub ok
Dec 01 09:50:32 compute-2 ceph-mon[76053]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Dec 01 09:50:32 compute-2 ceph-mon[76053]: Updating compute-1:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec 01 09:50:32 compute-2 ceph-mon[76053]: Updating compute-2:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec 01 09:50:32 compute-2 ceph-mon[76053]: 6.12 scrub starts
Dec 01 09:50:32 compute-2 ceph-mon[76053]: 6.12 scrub ok
Dec 01 09:50:32 compute-2 ceph-mon[76053]: pgmap v8: 194 pgs: 1 unknown, 193 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:50:32 compute-2 ceph-mon[76053]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Dec 01 09:50:32 compute-2 ceph-mon[76053]: Updating compute-0:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec 01 09:50:32 compute-2 ceph-mon[76053]: 7.1e deep-scrub starts
Dec 01 09:50:32 compute-2 ceph-mon[76053]: 7.1e deep-scrub ok
Dec 01 09:50:32 compute-2 ceph-mon[76053]: 5.f scrub starts
Dec 01 09:50:32 compute-2 ceph-mon[76053]: 5.f scrub ok
Dec 01 09:50:32 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]': finished
Dec 01 09:50:32 compute-2 ceph-mon[76053]: osdmap e46: 3 total, 3 up, 3 in
Dec 01 09:50:32 compute-2 ceph-mon[76053]: Saving service nfs.cephfs spec with placement compute-0;compute-1;compute-2
Dec 01 09:50:32 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:32 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 7.16 scrub starts
Dec 01 09:50:32 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 7.16 scrub ok
Dec 01 09:50:33 compute-2 ceph-mon[76053]: 7.5 scrub starts
Dec 01 09:50:33 compute-2 ceph-mon[76053]: 7.5 scrub ok
Dec 01 09:50:33 compute-2 ceph-mon[76053]: 7.6 scrub starts
Dec 01 09:50:33 compute-2 ceph-mon[76053]: 7.6 scrub ok
Dec 01 09:50:33 compute-2 ceph-mon[76053]: 5.1c scrub starts
Dec 01 09:50:33 compute-2 ceph-mon[76053]: 5.1c scrub ok
Dec 01 09:50:33 compute-2 ceph-mon[76053]: 7.1f scrub starts
Dec 01 09:50:33 compute-2 ceph-mon[76053]: pgmap v10: 194 pgs: 1 creating+peering, 193 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 27 KiB/s rd, 0 B/s wr, 11 op/s
Dec 01 09:50:33 compute-2 ceph-mon[76053]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Dec 01 09:50:33 compute-2 ceph-mon[76053]: 7.3 scrub starts
Dec 01 09:50:33 compute-2 ceph-mon[76053]: 7.3 scrub ok
Dec 01 09:50:33 compute-2 ceph-mon[76053]: 5.18 deep-scrub starts
Dec 01 09:50:33 compute-2 ceph-mon[76053]: 5.18 deep-scrub ok
Dec 01 09:50:33 compute-2 ceph-mon[76053]: 7.2 scrub starts
Dec 01 09:50:33 compute-2 ceph-mon[76053]: 7.2 scrub ok
Dec 01 09:50:33 compute-2 ceph-mon[76053]: 5.1b scrub starts
Dec 01 09:50:33 compute-2 ceph-mon[76053]: 5.1b scrub ok
Dec 01 09:50:33 compute-2 ceph-mon[76053]: pgmap v11: 194 pgs: 1 creating+peering, 193 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 22 KiB/s rd, 0 B/s wr, 8 op/s
Dec 01 09:50:33 compute-2 ceph-mon[76053]: 7.4 scrub starts
Dec 01 09:50:33 compute-2 ceph-mon[76053]: 7.4 scrub ok
Dec 01 09:50:33 compute-2 ceph-mon[76053]: 6.5 scrub starts
Dec 01 09:50:33 compute-2 ceph-mon[76053]: 6.5 scrub ok
Dec 01 09:50:33 compute-2 ceph-mon[76053]: 7.f scrub starts
Dec 01 09:50:33 compute-2 ceph-mon[76053]: 7.f scrub ok
Dec 01 09:50:33 compute-2 ceph-mon[76053]: 6.3 scrub starts
Dec 01 09:50:33 compute-2 ceph-mon[76053]: 6.3 scrub ok
Dec 01 09:50:33 compute-2 ceph-mon[76053]: pgmap v12: 194 pgs: 194 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 22 KiB/s rd, 0 B/s wr, 9 op/s
Dec 01 09:50:33 compute-2 ceph-mon[76053]: 7.8 scrub starts
Dec 01 09:50:33 compute-2 ceph-mon[76053]: 7.8 scrub ok
Dec 01 09:50:33 compute-2 ceph-mon[76053]: 6.2 scrub starts
Dec 01 09:50:33 compute-2 ceph-mon[76053]: 6.2 scrub ok
Dec 01 09:50:33 compute-2 ceph-mon[76053]: 7.1f scrub ok
Dec 01 09:50:33 compute-2 ceph-mon[76053]: 6.1e scrub starts
Dec 01 09:50:33 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:33 compute-2 ceph-mon[76053]: 6.1e scrub ok
Dec 01 09:50:33 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:33 compute-2 ceph-mon[76053]: osdmap e47: 3 total, 3 up, 3 in
Dec 01 09:50:33 compute-2 ceph-mon[76053]: Saving service ingress.nfs.cephfs spec with placement compute-0;compute-1;compute-2
Dec 01 09:50:33 compute-2 ceph-mon[76053]: mgrmap e27: compute-0.fospow(active, since 12s), standbys: compute-1.ymizfm, compute-2.kdtkls
Dec 01 09:50:33 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:33 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:33 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:33 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:33 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:33 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:33 compute-2 ceph-mon[76053]: Deploying daemon node-exporter.compute-1 on compute-1
Dec 01 09:50:33 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 6.17 scrub starts
Dec 01 09:50:34 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 6.17 scrub ok
Dec 01 09:50:34 compute-2 ceph-mon[76053]: 6.e scrub starts
Dec 01 09:50:34 compute-2 ceph-mon[76053]: 6.e scrub ok
Dec 01 09:50:34 compute-2 ceph-mon[76053]: 7.e deep-scrub starts
Dec 01 09:50:34 compute-2 ceph-mon[76053]: 7.e deep-scrub ok
Dec 01 09:50:34 compute-2 ceph-mon[76053]: 7.16 scrub starts
Dec 01 09:50:34 compute-2 ceph-mon[76053]: 7.16 scrub ok
Dec 01 09:50:34 compute-2 ceph-mon[76053]: pgmap v14: 194 pgs: 2 active+clean+scrubbing, 192 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 22 KiB/s rd, 0 B/s wr, 9 op/s
Dec 01 09:50:34 compute-2 sudo[81709]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:50:34 compute-2 sudo[81709]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:34 compute-2 sudo[81709]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:34 compute-2 sudo[81734]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/prometheus/node-exporter:v1.7.0 --timeout 895 _orch deploy --fsid 365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:50:34 compute-2 sudo[81734]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:34 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 7.11 scrub starts
Dec 01 09:50:35 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 7.11 scrub ok
Dec 01 09:50:35 compute-2 ceph-mon[76053]: 6.d scrub starts
Dec 01 09:50:35 compute-2 ceph-mon[76053]: 6.d scrub ok
Dec 01 09:50:35 compute-2 ceph-mon[76053]: 7.b deep-scrub starts
Dec 01 09:50:35 compute-2 ceph-mon[76053]: 7.b deep-scrub ok
Dec 01 09:50:35 compute-2 ceph-mon[76053]: 6.17 scrub starts
Dec 01 09:50:35 compute-2 ceph-mon[76053]: 6.17 scrub ok
Dec 01 09:50:35 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1169522764' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Dec 01 09:50:35 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1169522764' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Dec 01 09:50:35 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:35 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:35 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:35 compute-2 ceph-mon[76053]: Deploying daemon node-exporter.compute-2 on compute-2
Dec 01 09:50:35 compute-2 systemd[1]: Reloading.
Dec 01 09:50:35 compute-2 systemd-rc-local-generator[81829]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:50:35 compute-2 systemd-sysv-generator[81832]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:50:35 compute-2 systemd[1]: Reloading.
Dec 01 09:50:35 compute-2 systemd-rc-local-generator[81871]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:50:35 compute-2 systemd-sysv-generator[81875]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:50:35 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 6.1c deep-scrub starts
Dec 01 09:50:35 compute-2 systemd[1]: Starting Ceph node-exporter.compute-2 for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec 01 09:50:35 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 6.1c deep-scrub ok
Dec 01 09:50:36 compute-2 bash[81924]: Trying to pull quay.io/prometheus/node-exporter:v1.7.0...
Dec 01 09:50:36 compute-2 ceph-mon[76053]: 6.19 deep-scrub starts
Dec 01 09:50:36 compute-2 ceph-mon[76053]: 6.19 deep-scrub ok
Dec 01 09:50:36 compute-2 ceph-mon[76053]: 7.10 scrub starts
Dec 01 09:50:36 compute-2 ceph-mon[76053]: 7.10 scrub ok
Dec 01 09:50:36 compute-2 ceph-mon[76053]: 7.11 scrub starts
Dec 01 09:50:36 compute-2 ceph-mon[76053]: 7.11 scrub ok
Dec 01 09:50:36 compute-2 ceph-mon[76053]: pgmap v15: 194 pgs: 2 active+clean+scrubbing, 192 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 0 B/s wr, 0 op/s
Dec 01 09:50:36 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/4074645757' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Dec 01 09:50:37 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:50:37 compute-2 bash[81924]: Getting image source signatures
Dec 01 09:50:37 compute-2 bash[81924]: Copying blob sha256:324153f2810a9927fcce320af9e4e291e0b6e805cbdd1f338386c756b9defa24
Dec 01 09:50:37 compute-2 bash[81924]: Copying blob sha256:2abcce694348cd2c949c0e98a7400ebdfd8341021bcf6b541bc72033ce982510
Dec 01 09:50:37 compute-2 bash[81924]: Copying blob sha256:455fd88e5221bc1e278ef2d059cd70e4df99a24e5af050ede621534276f6cf9a
Dec 01 09:50:37 compute-2 ceph-mon[76053]: 6.1a scrub starts
Dec 01 09:50:37 compute-2 ceph-mon[76053]: 6.1a scrub ok
Dec 01 09:50:37 compute-2 ceph-mon[76053]: 6.16 scrub starts
Dec 01 09:50:37 compute-2 ceph-mon[76053]: 6.16 scrub ok
Dec 01 09:50:37 compute-2 ceph-mon[76053]: 6.1c deep-scrub starts
Dec 01 09:50:37 compute-2 ceph-mon[76053]: 6.1c deep-scrub ok
Dec 01 09:50:37 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/310913945' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 09:50:37 compute-2 bash[81924]: Copying config sha256:72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e
Dec 01 09:50:37 compute-2 bash[81924]: Writing manifest to image destination
Dec 01 09:50:38 compute-2 podman[81924]: 2025-12-01 09:50:38.03128968 +0000 UTC m=+1.752088654 container create f15aa877aa74ea87a56ed7c269b1149a449e9df36b30b3012f6ca84c92ef9519 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 01 09:50:38 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f6c3182dc4f5e7dc0a8d50aecdd8244da1246d38d41febc348e246fd8833398/merged/etc/node-exporter supports timestamps until 2038 (0x7fffffff)
Dec 01 09:50:38 compute-2 podman[81924]: 2025-12-01 09:50:38.083170547 +0000 UTC m=+1.803969541 container init f15aa877aa74ea87a56ed7c269b1149a449e9df36b30b3012f6ca84c92ef9519 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 01 09:50:38 compute-2 podman[81924]: 2025-12-01 09:50:38.088511052 +0000 UTC m=+1.809310026 container start f15aa877aa74ea87a56ed7c269b1149a449e9df36b30b3012f6ca84c92ef9519 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 01 09:50:38 compute-2 bash[81924]: f15aa877aa74ea87a56ed7c269b1149a449e9df36b30b3012f6ca84c92ef9519
Dec 01 09:50:38 compute-2 podman[81924]: 2025-12-01 09:50:38.013191784 +0000 UTC m=+1.733990788 image pull 72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e quay.io/prometheus/node-exporter:v1.7.0
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.100Z caller=node_exporter.go:192 level=info msg="Starting node_exporter" version="(version=1.7.0, branch=HEAD, revision=7333465abf9efba81876303bb57e6fadb946041b)"
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.100Z caller=node_exporter.go:193 level=info msg="Build context" build_context="(go=go1.21.4, platform=linux/amd64, user=root@35918982f6d8, date=20231112-23:53:35, tags=netgo osusergo static_build)"
Dec 01 09:50:38 compute-2 systemd[1]: Started Ceph node-exporter.compute-2 for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.102Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.102Z caller=diskstats_linux.go:265 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.102Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.102Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=arp
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=bcache
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=bonding
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=btrfs
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=conntrack
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=cpu
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=cpufreq
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=diskstats
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=dmi
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=edac
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=entropy
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=fibrechannel
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=filefd
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=filesystem
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=hwmon
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=infiniband
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=ipvs
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=loadavg
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=mdadm
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=meminfo
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=netclass
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=netdev
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=netstat
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=nfs
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=nfsd
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=nvme
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=os
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=powersupplyclass
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=pressure
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=rapl
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=schedstat
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=selinux
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=sockstat
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=softnet
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.104Z caller=node_exporter.go:117 level=info collector=stat
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.104Z caller=node_exporter.go:117 level=info collector=tapestats
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.104Z caller=node_exporter.go:117 level=info collector=textfile
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.104Z caller=node_exporter.go:117 level=info collector=thermal_zone
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.104Z caller=node_exporter.go:117 level=info collector=time
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.104Z caller=node_exporter.go:117 level=info collector=udp_queues
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.104Z caller=node_exporter.go:117 level=info collector=uname
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.104Z caller=node_exporter.go:117 level=info collector=vmstat
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.104Z caller=node_exporter.go:117 level=info collector=xfs
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.104Z caller=node_exporter.go:117 level=info collector=zfs
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.104Z caller=tls_config.go:274 level=info msg="Listening on" address=[::]:9100
Dec 01 09:50:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.104Z caller=tls_config.go:277 level=info msg="TLS is disabled." http2=false address=[::]:9100
Dec 01 09:50:38 compute-2 sudo[81734]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:38 compute-2 sudo[82008]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:50:38 compute-2 sudo[82008]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:38 compute-2 sudo[82008]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:38 compute-2 sudo[82033]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 365f19c2-81e5-5edd-b6b4-280555214d3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 --yes --no-systemd
Dec 01 09:50:38 compute-2 sudo[82033]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:38 compute-2 ceph-mon[76053]: 6.10 scrub starts
Dec 01 09:50:38 compute-2 ceph-mon[76053]: 6.10 scrub ok
Dec 01 09:50:38 compute-2 ceph-mon[76053]: pgmap v16: 194 pgs: 194 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 0 B/s wr, 0 op/s
Dec 01 09:50:38 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:38 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/176832347' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Dec 01 09:50:38 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:38 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:38 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:38 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:50:38 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:50:38 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:50:38 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:50:38 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:50:38 compute-2 podman[82097]: 2025-12-01 09:50:38.709086877 +0000 UTC m=+0.042915262 container create f8189234d626176e07d1c928b0717e4aed5a2d7a70c6d8f23325f17227d688d1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gracious_darwin, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325)
Dec 01 09:50:38 compute-2 systemd[1]: Started libpod-conmon-f8189234d626176e07d1c928b0717e4aed5a2d7a70c6d8f23325f17227d688d1.scope.
Dec 01 09:50:38 compute-2 systemd[1]: Started libcrun container.
Dec 01 09:50:38 compute-2 podman[82097]: 2025-12-01 09:50:38.690914589 +0000 UTC m=+0.024742994 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:50:38 compute-2 podman[82097]: 2025-12-01 09:50:38.789906624 +0000 UTC m=+0.123735039 container init f8189234d626176e07d1c928b0717e4aed5a2d7a70c6d8f23325f17227d688d1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gracious_darwin, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec 01 09:50:38 compute-2 podman[82097]: 2025-12-01 09:50:38.795897274 +0000 UTC m=+0.129725659 container start f8189234d626176e07d1c928b0717e4aed5a2d7a70c6d8f23325f17227d688d1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gracious_darwin, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Dec 01 09:50:38 compute-2 podman[82097]: 2025-12-01 09:50:38.799621138 +0000 UTC m=+0.133449553 container attach f8189234d626176e07d1c928b0717e4aed5a2d7a70c6d8f23325f17227d688d1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gracious_darwin, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:50:38 compute-2 gracious_darwin[82113]: 167 167
Dec 01 09:50:38 compute-2 systemd[1]: libpod-f8189234d626176e07d1c928b0717e4aed5a2d7a70c6d8f23325f17227d688d1.scope: Deactivated successfully.
Dec 01 09:50:38 compute-2 podman[82097]: 2025-12-01 09:50:38.80127443 +0000 UTC m=+0.135102815 container died f8189234d626176e07d1c928b0717e4aed5a2d7a70c6d8f23325f17227d688d1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gracious_darwin, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 01 09:50:38 compute-2 systemd[1]: var-lib-containers-storage-overlay-dc7599d3c8d303c7a6fc30573b63a7c6b83d6a542789bb1622857d7409834366-merged.mount: Deactivated successfully.
Dec 01 09:50:38 compute-2 podman[82097]: 2025-12-01 09:50:38.859370373 +0000 UTC m=+0.193198758 container remove f8189234d626176e07d1c928b0717e4aed5a2d7a70c6d8f23325f17227d688d1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gracious_darwin, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:50:38 compute-2 systemd[1]: libpod-conmon-f8189234d626176e07d1c928b0717e4aed5a2d7a70c6d8f23325f17227d688d1.scope: Deactivated successfully.
Dec 01 09:50:39 compute-2 podman[82140]: 2025-12-01 09:50:39.012470951 +0000 UTC m=+0.041267911 container create 09c877a89772ccf99c58928764770a5859783e2fa494ff7d2b46020f8e827105 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_wilson, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 01 09:50:39 compute-2 systemd[1]: Started libpod-conmon-09c877a89772ccf99c58928764770a5859783e2fa494ff7d2b46020f8e827105.scope.
Dec 01 09:50:39 compute-2 systemd[1]: Started libcrun container.
Dec 01 09:50:39 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd22257e6d78984d9e7790683115515f484c953676b3bd3ed6928f6c0d1f9517/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:50:39 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd22257e6d78984d9e7790683115515f484c953676b3bd3ed6928f6c0d1f9517/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:50:39 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd22257e6d78984d9e7790683115515f484c953676b3bd3ed6928f6c0d1f9517/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:50:39 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd22257e6d78984d9e7790683115515f484c953676b3bd3ed6928f6c0d1f9517/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:50:39 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd22257e6d78984d9e7790683115515f484c953676b3bd3ed6928f6c0d1f9517/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:50:39 compute-2 podman[82140]: 2025-12-01 09:50:38.994466957 +0000 UTC m=+0.023263947 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:50:39 compute-2 podman[82140]: 2025-12-01 09:50:39.218865131 +0000 UTC m=+0.247662131 container init 09c877a89772ccf99c58928764770a5859783e2fa494ff7d2b46020f8e827105 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_wilson, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:50:39 compute-2 podman[82140]: 2025-12-01 09:50:39.226415082 +0000 UTC m=+0.255212052 container start 09c877a89772ccf99c58928764770a5859783e2fa494ff7d2b46020f8e827105 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_wilson, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec 01 09:50:39 compute-2 podman[82140]: 2025-12-01 09:50:39.230287018 +0000 UTC m=+0.259083988 container attach 09c877a89772ccf99c58928764770a5859783e2fa494ff7d2b46020f8e827105 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_wilson, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:50:39 compute-2 ceph-mon[76053]: 7.9 scrub starts
Dec 01 09:50:39 compute-2 ceph-mon[76053]: 7.9 scrub ok
Dec 01 09:50:39 compute-2 ceph-mon[76053]: Health check failed: 1 OSD(s) experiencing slow operations in BlueStore (BLUESTORE_SLOW_OP_ALERT)
Dec 01 09:50:39 compute-2 sad_wilson[82157]: --> passed data devices: 0 physical, 1 LVM
Dec 01 09:50:39 compute-2 sad_wilson[82157]: --> All data devices are unavailable
Dec 01 09:50:39 compute-2 systemd[1]: libpod-09c877a89772ccf99c58928764770a5859783e2fa494ff7d2b46020f8e827105.scope: Deactivated successfully.
Dec 01 09:50:39 compute-2 podman[82140]: 2025-12-01 09:50:39.587988071 +0000 UTC m=+0.616785041 container died 09c877a89772ccf99c58928764770a5859783e2fa494ff7d2b46020f8e827105 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_wilson, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec 01 09:50:39 compute-2 systemd[1]: var-lib-containers-storage-overlay-fd22257e6d78984d9e7790683115515f484c953676b3bd3ed6928f6c0d1f9517-merged.mount: Deactivated successfully.
Dec 01 09:50:39 compute-2 podman[82140]: 2025-12-01 09:50:39.634444151 +0000 UTC m=+0.663241121 container remove 09c877a89772ccf99c58928764770a5859783e2fa494ff7d2b46020f8e827105 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_wilson, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Dec 01 09:50:39 compute-2 systemd[1]: libpod-conmon-09c877a89772ccf99c58928764770a5859783e2fa494ff7d2b46020f8e827105.scope: Deactivated successfully.
Dec 01 09:50:39 compute-2 sudo[82033]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:39 compute-2 sudo[82184]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:50:39 compute-2 sudo[82184]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:39 compute-2 sudo[82184]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:39 compute-2 sudo[82209]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 365f19c2-81e5-5edd-b6b4-280555214d3a -- lvm list --format json
Dec 01 09:50:39 compute-2 sudo[82209]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:40 compute-2 podman[82273]: 2025-12-01 09:50:40.202207536 +0000 UTC m=+0.051034347 container create 40d74c918d431fb0ea0587d9722c311047c751dccd66fa611a881aeb3911c39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_brown, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:50:40 compute-2 systemd[1]: Started libpod-conmon-40d74c918d431fb0ea0587d9722c311047c751dccd66fa611a881aeb3911c39c.scope.
Dec 01 09:50:40 compute-2 systemd[1]: Started libcrun container.
Dec 01 09:50:40 compute-2 podman[82273]: 2025-12-01 09:50:40.18332083 +0000 UTC m=+0.032147661 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:50:40 compute-2 podman[82273]: 2025-12-01 09:50:40.273311267 +0000 UTC m=+0.122138098 container init 40d74c918d431fb0ea0587d9722c311047c751dccd66fa611a881aeb3911c39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_brown, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:50:40 compute-2 podman[82273]: 2025-12-01 09:50:40.278261842 +0000 UTC m=+0.127088653 container start 40d74c918d431fb0ea0587d9722c311047c751dccd66fa611a881aeb3911c39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_brown, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Dec 01 09:50:40 compute-2 podman[82273]: 2025-12-01 09:50:40.281555955 +0000 UTC m=+0.130382766 container attach 40d74c918d431fb0ea0587d9722c311047c751dccd66fa611a881aeb3911c39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_brown, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:50:40 compute-2 vigilant_brown[82289]: 167 167
Dec 01 09:50:40 compute-2 systemd[1]: libpod-40d74c918d431fb0ea0587d9722c311047c751dccd66fa611a881aeb3911c39c.scope: Deactivated successfully.
Dec 01 09:50:40 compute-2 podman[82273]: 2025-12-01 09:50:40.282350855 +0000 UTC m=+0.131177666 container died 40d74c918d431fb0ea0587d9722c311047c751dccd66fa611a881aeb3911c39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_brown, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:50:40 compute-2 systemd[1]: var-lib-containers-storage-overlay-dc1110183e3569ac66d6a74b1d708fc11c6984ca526ab35315089573415cf372-merged.mount: Deactivated successfully.
Dec 01 09:50:40 compute-2 podman[82273]: 2025-12-01 09:50:40.322151558 +0000 UTC m=+0.170978369 container remove 40d74c918d431fb0ea0587d9722c311047c751dccd66fa611a881aeb3911c39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_brown, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid)
Dec 01 09:50:40 compute-2 systemd[1]: libpod-conmon-40d74c918d431fb0ea0587d9722c311047c751dccd66fa611a881aeb3911c39c.scope: Deactivated successfully.
Dec 01 09:50:40 compute-2 podman[82312]: 2025-12-01 09:50:40.479287657 +0000 UTC m=+0.043652681 container create bf07b959fe0ebd1b09e9bc7581d323ded5135f6cdd0f9526cde992ee7e732085 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_goodall, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec 01 09:50:40 compute-2 systemd[1]: Started libpod-conmon-bf07b959fe0ebd1b09e9bc7581d323ded5135f6cdd0f9526cde992ee7e732085.scope.
Dec 01 09:50:40 compute-2 systemd[1]: Started libcrun container.
Dec 01 09:50:40 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34ba8593fab5fefc7c1cf27c744d4255f31841b0e18a9e7b38502db506365247/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:50:40 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34ba8593fab5fefc7c1cf27c744d4255f31841b0e18a9e7b38502db506365247/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:50:40 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34ba8593fab5fefc7c1cf27c744d4255f31841b0e18a9e7b38502db506365247/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:50:40 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34ba8593fab5fefc7c1cf27c744d4255f31841b0e18a9e7b38502db506365247/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:50:40 compute-2 podman[82312]: 2025-12-01 09:50:40.460478593 +0000 UTC m=+0.024843647 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:50:40 compute-2 podman[82312]: 2025-12-01 09:50:40.60997472 +0000 UTC m=+0.174339834 container init bf07b959fe0ebd1b09e9bc7581d323ded5135f6cdd0f9526cde992ee7e732085 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_goodall, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec 01 09:50:40 compute-2 podman[82312]: 2025-12-01 09:50:40.617188101 +0000 UTC m=+0.181553125 container start bf07b959fe0ebd1b09e9bc7581d323ded5135f6cdd0f9526cde992ee7e732085 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_goodall, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2)
Dec 01 09:50:40 compute-2 podman[82312]: 2025-12-01 09:50:40.620922936 +0000 UTC m=+0.185287960 container attach bf07b959fe0ebd1b09e9bc7581d323ded5135f6cdd0f9526cde992ee7e732085 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_goodall, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Dec 01 09:50:40 compute-2 ceph-mon[76053]: 6.13 scrub starts
Dec 01 09:50:40 compute-2 ceph-mon[76053]: 6.13 scrub ok
Dec 01 09:50:40 compute-2 ceph-mon[76053]: pgmap v17: 194 pgs: 194 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 0 B/s wr, 0 op/s
Dec 01 09:50:40 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:40 compute-2 elegant_goodall[82329]: {
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:     "2": [
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:         {
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:             "devices": [
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:                 "/dev/loop3"
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:             ],
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:             "lv_name": "ceph_lv0",
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:             "lv_size": "21470642176",
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=flc7v0-5qhx-Xdzz-ZCA5-6uui-rVGJ-jP3FVa,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=365f19c2-81e5-5edd-b6b4-280555214d3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=0eea832e-1517-4443-89c1-2611993976f8,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:             "lv_uuid": "flc7v0-5qhx-Xdzz-ZCA5-6uui-rVGJ-jP3FVa",
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:             "name": "ceph_lv0",
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:             "tags": {
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:                 "ceph.block_uuid": "flc7v0-5qhx-Xdzz-ZCA5-6uui-rVGJ-jP3FVa",
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:                 "ceph.cephx_lockbox_secret": "",
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:                 "ceph.cluster_fsid": "365f19c2-81e5-5edd-b6b4-280555214d3a",
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:                 "ceph.cluster_name": "ceph",
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:                 "ceph.crush_device_class": "",
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:                 "ceph.encrypted": "0",
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:                 "ceph.osd_fsid": "0eea832e-1517-4443-89c1-2611993976f8",
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:                 "ceph.osd_id": "2",
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:                 "ceph.type": "block",
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:                 "ceph.vdo": "0",
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:                 "ceph.with_tpm": "0"
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:             },
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:             "type": "block",
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:             "vg_name": "ceph_vg0"
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:         }
Dec 01 09:50:40 compute-2 elegant_goodall[82329]:     ]
Dec 01 09:50:40 compute-2 elegant_goodall[82329]: }
Dec 01 09:50:41 compute-2 systemd[1]: libpod-bf07b959fe0ebd1b09e9bc7581d323ded5135f6cdd0f9526cde992ee7e732085.scope: Deactivated successfully.
Dec 01 09:50:41 compute-2 podman[82338]: 2025-12-01 09:50:41.075223781 +0000 UTC m=+0.031734931 container died bf07b959fe0ebd1b09e9bc7581d323ded5135f6cdd0f9526cde992ee7e732085 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_goodall, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec 01 09:50:41 compute-2 systemd[1]: var-lib-containers-storage-overlay-34ba8593fab5fefc7c1cf27c744d4255f31841b0e18a9e7b38502db506365247-merged.mount: Deactivated successfully.
Dec 01 09:50:41 compute-2 podman[82338]: 2025-12-01 09:50:41.110277364 +0000 UTC m=+0.066788504 container remove bf07b959fe0ebd1b09e9bc7581d323ded5135f6cdd0f9526cde992ee7e732085 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_goodall, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec 01 09:50:41 compute-2 systemd[1]: libpod-conmon-bf07b959fe0ebd1b09e9bc7581d323ded5135f6cdd0f9526cde992ee7e732085.scope: Deactivated successfully.
Dec 01 09:50:41 compute-2 sudo[82209]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:41 compute-2 sudo[82350]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:50:41 compute-2 sudo[82350]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:41 compute-2 sudo[82350]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:41 compute-2 sudo[82375]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 365f19c2-81e5-5edd-b6b4-280555214d3a -- raw list --format json
Dec 01 09:50:41 compute-2 sudo[82375]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:41 compute-2 podman[82439]: 2025-12-01 09:50:41.725884865 +0000 UTC m=+0.039324522 container create 3c2ce6dc338176194a589ca7fe1db41cd9e58d8122cb98a4e9c98a49b428ee52 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=goofy_villani, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS)
Dec 01 09:50:41 compute-2 systemd[1]: Started libpod-conmon-3c2ce6dc338176194a589ca7fe1db41cd9e58d8122cb98a4e9c98a49b428ee52.scope.
Dec 01 09:50:41 compute-2 ceph-mon[76053]: 6.1d scrub starts
Dec 01 09:50:41 compute-2 ceph-mon[76053]: 6.1d scrub ok
Dec 01 09:50:41 compute-2 ceph-mon[76053]: 7.13 scrub starts
Dec 01 09:50:41 compute-2 ceph-mon[76053]: 7.13 scrub ok
Dec 01 09:50:41 compute-2 systemd[1]: Started libcrun container.
Dec 01 09:50:41 compute-2 podman[82439]: 2025-12-01 09:50:41.80349374 +0000 UTC m=+0.116933397 container init 3c2ce6dc338176194a589ca7fe1db41cd9e58d8122cb98a4e9c98a49b428ee52 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=goofy_villani, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1)
Dec 01 09:50:41 compute-2 podman[82439]: 2025-12-01 09:50:41.709495642 +0000 UTC m=+0.022935299 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:50:41 compute-2 podman[82439]: 2025-12-01 09:50:41.808986018 +0000 UTC m=+0.122425675 container start 3c2ce6dc338176194a589ca7fe1db41cd9e58d8122cb98a4e9c98a49b428ee52 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=goofy_villani, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.build-date=20250325, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1)
Dec 01 09:50:41 compute-2 podman[82439]: 2025-12-01 09:50:41.812642791 +0000 UTC m=+0.126082448 container attach 3c2ce6dc338176194a589ca7fe1db41cd9e58d8122cb98a4e9c98a49b428ee52 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=goofy_villani, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:50:41 compute-2 goofy_villani[82456]: 167 167
Dec 01 09:50:41 compute-2 systemd[1]: libpod-3c2ce6dc338176194a589ca7fe1db41cd9e58d8122cb98a4e9c98a49b428ee52.scope: Deactivated successfully.
Dec 01 09:50:41 compute-2 podman[82439]: 2025-12-01 09:50:41.814785034 +0000 UTC m=+0.128224691 container died 3c2ce6dc338176194a589ca7fe1db41cd9e58d8122cb98a4e9c98a49b428ee52 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=goofy_villani, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:50:41 compute-2 systemd[1]: var-lib-containers-storage-overlay-14eb8c54fe16b59699e8426e01b60251dd2a39fc6270d4cc162f366e80d5cab7-merged.mount: Deactivated successfully.
Dec 01 09:50:41 compute-2 podman[82439]: 2025-12-01 09:50:41.851968752 +0000 UTC m=+0.165408409 container remove 3c2ce6dc338176194a589ca7fe1db41cd9e58d8122cb98a4e9c98a49b428ee52 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=goofy_villani, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:50:41 compute-2 systemd[1]: libpod-conmon-3c2ce6dc338176194a589ca7fe1db41cd9e58d8122cb98a4e9c98a49b428ee52.scope: Deactivated successfully.
Dec 01 09:50:41 compute-2 podman[82480]: 2025-12-01 09:50:41.994970504 +0000 UTC m=+0.038642985 container create 71b119d39d7b7ce87184d69a1c1885ec6561aa00b124dcbde730f21ef3e3a685 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_easley, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec 01 09:50:42 compute-2 systemd[1]: Started libpod-conmon-71b119d39d7b7ce87184d69a1c1885ec6561aa00b124dcbde730f21ef3e3a685.scope.
Dec 01 09:50:42 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:50:42 compute-2 systemd[1]: Started libcrun container.
Dec 01 09:50:42 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8edae8889f18d809285bdffe6557a965a2ca7eb9ac6c9f87ccf0f5379aac086f/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 09:50:42 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8edae8889f18d809285bdffe6557a965a2ca7eb9ac6c9f87ccf0f5379aac086f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:50:42 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8edae8889f18d809285bdffe6557a965a2ca7eb9ac6c9f87ccf0f5379aac086f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:50:42 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8edae8889f18d809285bdffe6557a965a2ca7eb9ac6c9f87ccf0f5379aac086f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:50:42 compute-2 podman[82480]: 2025-12-01 09:50:42.067856201 +0000 UTC m=+0.111528692 container init 71b119d39d7b7ce87184d69a1c1885ec6561aa00b124dcbde730f21ef3e3a685 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_easley, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec 01 09:50:42 compute-2 podman[82480]: 2025-12-01 09:50:41.979659408 +0000 UTC m=+0.023331909 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:50:42 compute-2 podman[82480]: 2025-12-01 09:50:42.077128095 +0000 UTC m=+0.120800576 container start 71b119d39d7b7ce87184d69a1c1885ec6561aa00b124dcbde730f21ef3e3a685 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_easley, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:50:42 compute-2 podman[82480]: 2025-12-01 09:50:42.080390566 +0000 UTC m=+0.124063067 container attach 71b119d39d7b7ce87184d69a1c1885ec6561aa00b124dcbde730f21ef3e3a685 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_easley, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:50:42 compute-2 lvm[82570]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 01 09:50:42 compute-2 lvm[82570]: VG ceph_vg0 finished
Dec 01 09:50:42 compute-2 jolly_easley[82496]: {}
Dec 01 09:50:42 compute-2 systemd[1]: libpod-71b119d39d7b7ce87184d69a1c1885ec6561aa00b124dcbde730f21ef3e3a685.scope: Deactivated successfully.
Dec 01 09:50:42 compute-2 systemd[1]: libpod-71b119d39d7b7ce87184d69a1c1885ec6561aa00b124dcbde730f21ef3e3a685.scope: Consumed 1.228s CPU time.
Dec 01 09:50:42 compute-2 podman[82480]: 2025-12-01 09:50:42.836906786 +0000 UTC m=+0.880579277 container died 71b119d39d7b7ce87184d69a1c1885ec6561aa00b124dcbde730f21ef3e3a685 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_easley, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1)
Dec 01 09:50:42 compute-2 ceph-mon[76053]: from='client.14466 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 01 09:50:42 compute-2 ceph-mon[76053]: pgmap v18: 194 pgs: 194 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:50:42 compute-2 ceph-mon[76053]: 6.11 scrub starts
Dec 01 09:50:42 compute-2 ceph-mon[76053]: 6.11 scrub ok
Dec 01 09:50:42 compute-2 systemd[1]: var-lib-containers-storage-overlay-8edae8889f18d809285bdffe6557a965a2ca7eb9ac6c9f87ccf0f5379aac086f-merged.mount: Deactivated successfully.
Dec 01 09:50:42 compute-2 podman[82480]: 2025-12-01 09:50:42.914711477 +0000 UTC m=+0.958383958 container remove 71b119d39d7b7ce87184d69a1c1885ec6561aa00b124dcbde730f21ef3e3a685 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_easley, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 01 09:50:42 compute-2 systemd[1]: libpod-conmon-71b119d39d7b7ce87184d69a1c1885ec6561aa00b124dcbde730f21ef3e3a685.scope: Deactivated successfully.
Dec 01 09:50:42 compute-2 sudo[82375]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:44 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:44 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:44 compute-2 ceph-mon[76053]: pgmap v19: 194 pgs: 194 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:50:45 compute-2 sudo[82588]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:50:45 compute-2 sudo[82588]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:45 compute-2 sudo[82588]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:45 compute-2 sudo[82613]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:50:45 compute-2 sudo[82613]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:45 compute-2 podman[82678]: 2025-12-01 09:50:45.528108541 +0000 UTC m=+0.041302241 container create 587b254c72a14770fdef5cd709d8cabb644520bafb81b06a0fed4772ecbab79a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_mendel, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:50:45 compute-2 systemd[1]: Started libpod-conmon-587b254c72a14770fdef5cd709d8cabb644520bafb81b06a0fed4772ecbab79a.scope.
Dec 01 09:50:45 compute-2 systemd[1]: Started libcrun container.
Dec 01 09:50:45 compute-2 podman[82678]: 2025-12-01 09:50:45.605729017 +0000 UTC m=+0.118922747 container init 587b254c72a14770fdef5cd709d8cabb644520bafb81b06a0fed4772ecbab79a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_mendel, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325)
Dec 01 09:50:45 compute-2 podman[82678]: 2025-12-01 09:50:45.511047541 +0000 UTC m=+0.024241281 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:50:45 compute-2 podman[82678]: 2025-12-01 09:50:45.61338074 +0000 UTC m=+0.126574450 container start 587b254c72a14770fdef5cd709d8cabb644520bafb81b06a0fed4772ecbab79a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_mendel, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:50:45 compute-2 podman[82678]: 2025-12-01 09:50:45.617226957 +0000 UTC m=+0.130420687 container attach 587b254c72a14770fdef5cd709d8cabb644520bafb81b06a0fed4772ecbab79a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_mendel, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:50:45 compute-2 cool_mendel[82694]: 167 167
Dec 01 09:50:45 compute-2 systemd[1]: libpod-587b254c72a14770fdef5cd709d8cabb644520bafb81b06a0fed4772ecbab79a.scope: Deactivated successfully.
Dec 01 09:50:45 compute-2 podman[82678]: 2025-12-01 09:50:45.619143235 +0000 UTC m=+0.132336965 container died 587b254c72a14770fdef5cd709d8cabb644520bafb81b06a0fed4772ecbab79a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_mendel, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.40.1)
Dec 01 09:50:45 compute-2 systemd[1]: var-lib-containers-storage-overlay-af3fdec1cd3d61f90c9c539f877cd0e945541ee31cc4343a4a277cea2909c595-merged.mount: Deactivated successfully.
Dec 01 09:50:45 compute-2 podman[82678]: 2025-12-01 09:50:45.652768342 +0000 UTC m=+0.165962052 container remove 587b254c72a14770fdef5cd709d8cabb644520bafb81b06a0fed4772ecbab79a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_mendel, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Dec 01 09:50:45 compute-2 systemd[1]: libpod-conmon-587b254c72a14770fdef5cd709d8cabb644520bafb81b06a0fed4772ecbab79a.scope: Deactivated successfully.
Dec 01 09:50:45 compute-2 systemd[1]: Reloading.
Dec 01 09:50:45 compute-2 ceph-mon[76053]: from='client.14472 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 01 09:50:45 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:45 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:45 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.ugomkp", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec 01 09:50:45 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.ugomkp", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec 01 09:50:45 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:45 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:50:45 compute-2 systemd-sysv-generator[82741]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:50:45 compute-2 systemd-rc-local-generator[82737]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:50:46 compute-2 systemd[1]: Reloading.
Dec 01 09:50:46 compute-2 systemd-rc-local-generator[82780]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:50:46 compute-2 systemd-sysv-generator[82784]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:50:46 compute-2 systemd[1]: Starting Ceph rgw.rgw.compute-2.ugomkp for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec 01 09:50:46 compute-2 podman[82836]: 2025-12-01 09:50:46.641563385 +0000 UTC m=+0.041607650 container create 0aff3824a3eea9fc76d7152cfad05a6c2b5b0bc2c31dc26ecd7ad8b3b0dd373c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-rgw-rgw-compute-2-ugomkp, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:50:46 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6029798ff0b52abb3baa18f5107d1f482b224fc47e0abbd2b976f3fa8757de8d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:50:46 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6029798ff0b52abb3baa18f5107d1f482b224fc47e0abbd2b976f3fa8757de8d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:50:46 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6029798ff0b52abb3baa18f5107d1f482b224fc47e0abbd2b976f3fa8757de8d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:50:46 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6029798ff0b52abb3baa18f5107d1f482b224fc47e0abbd2b976f3fa8757de8d/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-2.ugomkp supports timestamps until 2038 (0x7fffffff)
Dec 01 09:50:46 compute-2 podman[82836]: 2025-12-01 09:50:46.698464648 +0000 UTC m=+0.098508933 container init 0aff3824a3eea9fc76d7152cfad05a6c2b5b0bc2c31dc26ecd7ad8b3b0dd373c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-rgw-rgw-compute-2-ugomkp, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=squid, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:50:46 compute-2 podman[82836]: 2025-12-01 09:50:46.704276604 +0000 UTC m=+0.104320869 container start 0aff3824a3eea9fc76d7152cfad05a6c2b5b0bc2c31dc26ecd7ad8b3b0dd373c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-rgw-rgw-compute-2-ugomkp, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS)
Dec 01 09:50:46 compute-2 bash[82836]: 0aff3824a3eea9fc76d7152cfad05a6c2b5b0bc2c31dc26ecd7ad8b3b0dd373c
Dec 01 09:50:46 compute-2 podman[82836]: 2025-12-01 09:50:46.622162206 +0000 UTC m=+0.022206491 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:50:46 compute-2 systemd[1]: Started Ceph rgw.rgw.compute-2.ugomkp for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 09:50:46 compute-2 radosgw[82855]: deferred set uid:gid to 167:167 (ceph:ceph)
Dec 01 09:50:46 compute-2 radosgw[82855]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process radosgw, pid 2
Dec 01 09:50:46 compute-2 radosgw[82855]: framework: beast
Dec 01 09:50:46 compute-2 radosgw[82855]: framework conf key: endpoint, val: 192.168.122.102:8082
Dec 01 09:50:46 compute-2 radosgw[82855]: init_numa not setting numa affinity
Dec 01 09:50:46 compute-2 sudo[82613]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:46 compute-2 ceph-mon[76053]: Deploying daemon rgw.rgw.compute-2.ugomkp on compute-2
Dec 01 09:50:46 compute-2 ceph-mon[76053]: pgmap v20: 194 pgs: 194 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:50:46 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:46 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:46 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:46 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.alkudt", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec 01 09:50:46 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.alkudt", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec 01 09:50:46 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:46 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:50:46 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e48 e48: 3 total, 3 up, 3 in
Dec 01 09:50:46 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"} v 0)
Dec 01 09:50:46 compute-2 ceph-mon[76053]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/1702895159' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Dec 01 09:50:47 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e48 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:50:48 compute-2 ceph-mon[76053]: from='client.14478 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 01 09:50:48 compute-2 ceph-mon[76053]: Deploying daemon rgw.rgw.compute-1.alkudt on compute-1
Dec 01 09:50:48 compute-2 ceph-mon[76053]: osdmap e48: 3 total, 3 up, 3 in
Dec 01 09:50:48 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1702895159' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Dec 01 09:50:48 compute-2 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Dec 01 09:50:48 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e49 e49: 3 total, 3 up, 3 in
Dec 01 09:50:49 compute-2 ceph-mon[76053]: pgmap v22: 195 pgs: 1 unknown, 194 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:50:49 compute-2 ceph-mon[76053]: from='client.14484 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 01 09:50:49 compute-2 ceph-mon[76053]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Dec 01 09:50:49 compute-2 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-2.ugomkp' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
Dec 01 09:50:49 compute-2 ceph-mon[76053]: osdmap e49: 3 total, 3 up, 3 in
Dec 01 09:50:49 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:49 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:49 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:49 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.mxrshg", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec 01 09:50:49 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.mxrshg", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec 01 09:50:49 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:49 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:50:49 compute-2 ceph-mon[76053]: Deploying daemon rgw.rgw.compute-0.mxrshg on compute-0
Dec 01 09:50:49 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e50 e50: 3 total, 3 up, 3 in
Dec 01 09:50:49 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"} v 0)
Dec 01 09:50:49 compute-2 ceph-mon[76053]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/4186493149' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Dec 01 09:50:50 compute-2 ceph-mon[76053]: osdmap e50: 3 total, 3 up, 3 in
Dec 01 09:50:50 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/4186493149' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Dec 01 09:50:50 compute-2 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Dec 01 09:50:50 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/603108535' entity='client.rgw.rgw.compute-1.alkudt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Dec 01 09:50:50 compute-2 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-1.alkudt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Dec 01 09:50:50 compute-2 ceph-mon[76053]: pgmap v25: 196 pgs: 2 unknown, 194 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:50:50 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3400550637' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Dec 01 09:50:51 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e51 e51: 3 total, 3 up, 3 in
Dec 01 09:50:52 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:50:52 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e52 e52: 3 total, 3 up, 3 in
Dec 01 09:50:52 compute-2 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-2.ugomkp' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Dec 01 09:50:52 compute-2 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-1.alkudt' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Dec 01 09:50:52 compute-2 ceph-mon[76053]: osdmap e51: 3 total, 3 up, 3 in
Dec 01 09:50:52 compute-2 sudo[83442]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:50:52 compute-2 sudo[83442]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:52 compute-2 sudo[83442]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:52 compute-2 sudo[83467]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:50:52 compute-2 sudo[83467]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:50:53 compute-2 podman[83532]: 2025-12-01 09:50:53.124976773 +0000 UTC m=+0.040645605 container create 406c4e840b4cf34d8ae5bf8b3f486b9aa12facf2b1b3c864d9adba5fb80f62b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_mendel, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:50:53 compute-2 systemd[1]: Started libpod-conmon-406c4e840b4cf34d8ae5bf8b3f486b9aa12facf2b1b3c864d9adba5fb80f62b1.scope.
Dec 01 09:50:53 compute-2 systemd[1]: Started libcrun container.
Dec 01 09:50:53 compute-2 podman[83532]: 2025-12-01 09:50:53.107445442 +0000 UTC m=+0.023114294 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:50:53 compute-2 podman[83532]: 2025-12-01 09:50:53.214890319 +0000 UTC m=+0.130559181 container init 406c4e840b4cf34d8ae5bf8b3f486b9aa12facf2b1b3c864d9adba5fb80f62b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_mendel, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 01 09:50:53 compute-2 podman[83532]: 2025-12-01 09:50:53.222299376 +0000 UTC m=+0.137968208 container start 406c4e840b4cf34d8ae5bf8b3f486b9aa12facf2b1b3c864d9adba5fb80f62b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_mendel, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:50:53 compute-2 podman[83532]: 2025-12-01 09:50:53.225908836 +0000 UTC m=+0.141577688 container attach 406c4e840b4cf34d8ae5bf8b3f486b9aa12facf2b1b3c864d9adba5fb80f62b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_mendel, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_REF=squid, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:50:53 compute-2 suspicious_mendel[83548]: 167 167
Dec 01 09:50:53 compute-2 systemd[1]: libpod-406c4e840b4cf34d8ae5bf8b3f486b9aa12facf2b1b3c864d9adba5fb80f62b1.scope: Deactivated successfully.
Dec 01 09:50:53 compute-2 podman[83532]: 2025-12-01 09:50:53.22801549 +0000 UTC m=+0.143684322 container died 406c4e840b4cf34d8ae5bf8b3f486b9aa12facf2b1b3c864d9adba5fb80f62b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_mendel, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1)
Dec 01 09:50:53 compute-2 systemd[1]: var-lib-containers-storage-overlay-b117034e8901cc2f33bafc8e62a6f674e698375a301ef1fa3885e3e82e4a4a6d-merged.mount: Deactivated successfully.
Dec 01 09:50:53 compute-2 podman[83532]: 2025-12-01 09:50:53.27011509 +0000 UTC m=+0.185783922 container remove 406c4e840b4cf34d8ae5bf8b3f486b9aa12facf2b1b3c864d9adba5fb80f62b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_mendel, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.build-date=20250325, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:50:53 compute-2 systemd[1]: libpod-conmon-406c4e840b4cf34d8ae5bf8b3f486b9aa12facf2b1b3c864d9adba5fb80f62b1.scope: Deactivated successfully.
Dec 01 09:50:53 compute-2 systemd[1]: Reloading.
Dec 01 09:50:53 compute-2 ceph-mon[76053]: pgmap v27: 196 pgs: 196 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 5.3 KiB/s rd, 1.3 KiB/s wr, 7 op/s
Dec 01 09:50:53 compute-2 ceph-mon[76053]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Dec 01 09:50:53 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/4018065374' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
Dec 01 09:50:53 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:53 compute-2 ceph-mon[76053]: osdmap e52: 3 total, 3 up, 3 in
Dec 01 09:50:53 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/603108535' entity='client.rgw.rgw.compute-1.alkudt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Dec 01 09:50:53 compute-2 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-1.alkudt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Dec 01 09:50:53 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/4125115031' entity='client.rgw.rgw.compute-0.mxrshg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Dec 01 09:50:53 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:53 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:53 compute-2 ceph-mon[76053]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Dec 01 09:50:53 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:53 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:53 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.yoegjc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Dec 01 09:50:53 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.yoegjc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Dec 01 09:50:53 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:50:53 compute-2 ceph-mon[76053]: Deploying daemon mds.cephfs.compute-2.yoegjc on compute-2
Dec 01 09:50:53 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e53 e53: 3 total, 3 up, 3 in
Dec 01 09:50:53 compute-2 systemd-rc-local-generator[83592]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:50:53 compute-2 systemd-sysv-generator[83596]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:50:53 compute-2 systemd[1]: Reloading.
Dec 01 09:50:53 compute-2 systemd-rc-local-generator[83634]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:50:53 compute-2 systemd-sysv-generator[83637]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:50:53 compute-2 systemd[1]: Starting Ceph mds.cephfs.compute-2.yoegjc for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec 01 09:50:54 compute-2 podman[83691]: 2025-12-01 09:50:54.10203385 +0000 UTC m=+0.040534442 container create 5af198da0a92f5b479fcaa3d2b33d6cfc5afb96ed88f8c2b8a3e829de2679cf6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mds-cephfs-compute-2-yoegjc, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_REF=squid, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 01 09:50:54 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45a82e01234419514f6aa478b010956ea59f6f04c0310c0f332f97d2f7b44a56/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:50:54 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45a82e01234419514f6aa478b010956ea59f6f04c0310c0f332f97d2f7b44a56/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 09:50:54 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45a82e01234419514f6aa478b010956ea59f6f04c0310c0f332f97d2f7b44a56/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 09:50:54 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45a82e01234419514f6aa478b010956ea59f6f04c0310c0f332f97d2f7b44a56/merged/var/lib/ceph/mds/ceph-cephfs.compute-2.yoegjc supports timestamps until 2038 (0x7fffffff)
Dec 01 09:50:54 compute-2 podman[83691]: 2025-12-01 09:50:54.16474859 +0000 UTC m=+0.103249202 container init 5af198da0a92f5b479fcaa3d2b33d6cfc5afb96ed88f8c2b8a3e829de2679cf6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mds-cephfs-compute-2-yoegjc, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:50:54 compute-2 podman[83691]: 2025-12-01 09:50:54.170846434 +0000 UTC m=+0.109347026 container start 5af198da0a92f5b479fcaa3d2b33d6cfc5afb96ed88f8c2b8a3e829de2679cf6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mds-cephfs-compute-2-yoegjc, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:50:54 compute-2 bash[83691]: 5af198da0a92f5b479fcaa3d2b33d6cfc5afb96ed88f8c2b8a3e829de2679cf6
Dec 01 09:50:54 compute-2 podman[83691]: 2025-12-01 09:50:54.083560505 +0000 UTC m=+0.022061117 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:50:54 compute-2 systemd[1]: Started Ceph mds.cephfs.compute-2.yoegjc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 09:50:54 compute-2 ceph-mds[83711]: set uid:gid to 167:167 (ceph:ceph)
Dec 01 09:50:54 compute-2 ceph-mds[83711]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mds, pid 2
Dec 01 09:50:54 compute-2 ceph-mds[83711]: main not setting numa affinity
Dec 01 09:50:54 compute-2 ceph-mds[83711]: pidfile_write: ignore empty --pid-file
Dec 01 09:50:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mds-cephfs-compute-2-yoegjc[83707]: starting mds.cephfs.compute-2.yoegjc at 
Dec 01 09:50:54 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Updating MDS map to version 2 from mon.1
Dec 01 09:50:54 compute-2 sudo[83467]: pam_unix(sudo:session): session closed for user root
Dec 01 09:50:54 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).mds e3 new map
Dec 01 09:50:54 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).mds e3 print_map
                                           e3
                                           btime 2025-12-01T09:50:54.337178+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        2
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-12-01T09:50:20.704523+0000
                                           modified        2025-12-01T09:50:20.704523+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        
                                           up        {}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 0 members: 
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-2.yoegjc{-1:24223} state up:standby seq 1 addr [v2:192.168.122.102:6804/3260542897,v1:192.168.122.102:6805/3260542897] compat {c=[1],r=[1],i=[1fff]}]
Dec 01 09:50:54 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e54 e54: 3 total, 3 up, 3 in
Dec 01 09:50:54 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Updating MDS map to version 3 from mon.1
Dec 01 09:50:54 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Monitors have assigned me to become a standby
Dec 01 09:50:54 compute-2 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-1.alkudt' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Dec 01 09:50:54 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/4125115031' entity='client.rgw.rgw.compute-0.mxrshg' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Dec 01 09:50:54 compute-2 ceph-mon[76053]: osdmap e53: 3 total, 3 up, 3 in
Dec 01 09:50:54 compute-2 ceph-mon[76053]: pgmap v30: 197 pgs: 1 unknown, 196 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 5.6 KiB/s rd, 1.4 KiB/s wr, 7 op/s
Dec 01 09:50:54 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/4217549561' entity='client.admin' cmd=[{"prefix": "osd get-require-min-compat-client"}]: dispatch
Dec 01 09:50:54 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:54 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:54 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:54 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.xijran", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Dec 01 09:50:54 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.xijran", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Dec 01 09:50:54 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:50:54 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"} v 0)
Dec 01 09:50:54 compute-2 ceph-mon[76053]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/4186493149' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec 01 09:50:54 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).mds e4 new map
Dec 01 09:50:54 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).mds e4 print_map
                                           e4
                                           btime 2025-12-01T09:50:54.367365+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        4
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-12-01T09:50:20.704523+0000
                                           modified        2025-12-01T09:50:54.367356+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24223}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 0 members: 
                                           [mds.cephfs.compute-2.yoegjc{0:24223} state up:creating seq 1 addr [v2:192.168.122.102:6804/3260542897,v1:192.168.122.102:6805/3260542897] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
Dec 01 09:50:54 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Updating MDS map to version 4 from mon.1
Dec 01 09:50:54 compute-2 ceph-mds[83711]: mds.0.4 handle_mds_map I am now mds.0.4
Dec 01 09:50:54 compute-2 ceph-mds[83711]: mds.0.4 handle_mds_map state change up:standby --> up:creating
Dec 01 09:50:54 compute-2 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x1
Dec 01 09:50:54 compute-2 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x100
Dec 01 09:50:54 compute-2 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x600
Dec 01 09:50:54 compute-2 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x601
Dec 01 09:50:54 compute-2 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x602
Dec 01 09:50:54 compute-2 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x603
Dec 01 09:50:54 compute-2 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x604
Dec 01 09:50:54 compute-2 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x605
Dec 01 09:50:54 compute-2 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x606
Dec 01 09:50:54 compute-2 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x607
Dec 01 09:50:54 compute-2 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x608
Dec 01 09:50:54 compute-2 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x609
Dec 01 09:50:54 compute-2 ceph-mds[83711]: mds.0.4 creating_done
Dec 01 09:50:55 compute-2 ceph-mon[76053]: Deploying daemon mds.cephfs.compute-0.xijran on compute-0
Dec 01 09:50:55 compute-2 ceph-mon[76053]: mds.? [v2:192.168.122.102:6804/3260542897,v1:192.168.122.102:6805/3260542897] up:boot
Dec 01 09:50:55 compute-2 ceph-mon[76053]: daemon mds.cephfs.compute-2.yoegjc assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Dec 01 09:50:55 compute-2 ceph-mon[76053]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Dec 01 09:50:55 compute-2 ceph-mon[76053]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Dec 01 09:50:55 compute-2 ceph-mon[76053]: fsmap cephfs:0 1 up:standby
Dec 01 09:50:55 compute-2 ceph-mon[76053]: osdmap e54: 3 total, 3 up, 3 in
Dec 01 09:50:55 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-2.yoegjc"}]: dispatch
Dec 01 09:50:55 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/4125115031' entity='client.rgw.rgw.compute-0.mxrshg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec 01 09:50:55 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/603108535' entity='client.rgw.rgw.compute-1.alkudt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec 01 09:50:55 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/4186493149' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec 01 09:50:55 compute-2 ceph-mon[76053]: fsmap cephfs:1 {0=cephfs.compute-2.yoegjc=up:creating}
Dec 01 09:50:55 compute-2 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec 01 09:50:55 compute-2 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-1.alkudt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec 01 09:50:55 compute-2 ceph-mon[76053]: daemon mds.cephfs.compute-2.yoegjc is now active in filesystem cephfs as rank 0
Dec 01 09:50:55 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:55 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).mds e5 new map
Dec 01 09:50:55 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).mds e5 print_map
                                           e5
                                           btime 2025-12-01T09:50:55:377739+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-12-01T09:50:20.704523+0000
                                           modified        2025-12-01T09:50:55.377737+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24223}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 24223 members: 24223
                                           [mds.cephfs.compute-2.yoegjc{0:24223} state up:active seq 2 addr [v2:192.168.122.102:6804/3260542897,v1:192.168.122.102:6805/3260542897] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
Dec 01 09:50:55 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e55 e55: 3 total, 3 up, 3 in
Dec 01 09:50:55 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Updating MDS map to version 5 from mon.1
Dec 01 09:50:55 compute-2 ceph-mds[83711]: mds.0.4 handle_mds_map I am now mds.0.4
Dec 01 09:50:55 compute-2 ceph-mds[83711]: mds.0.4 handle_mds_map state change up:creating --> up:active
Dec 01 09:50:55 compute-2 ceph-mds[83711]: mds.0.4 recovery_done -- successful recovery!
Dec 01 09:50:55 compute-2 ceph-mds[83711]: mds.0.4 active_start
Dec 01 09:50:55 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"} v 0)
Dec 01 09:50:55 compute-2 ceph-mon[76053]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/4186493149' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec 01 09:50:56 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e56 e56: 3 total, 3 up, 3 in
Dec 01 09:50:56 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).mds e6 new map
Dec 01 09:50:56 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).mds e6 print_map
                                           e6
                                           btime 2025-12-01T09:50:56:603484+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-12-01T09:50:20.704523+0000
                                           modified        2025-12-01T09:50:55.377737+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24223}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 24223 members: 24223
                                           [mds.cephfs.compute-2.yoegjc{0:24223} state up:active seq 2 addr [v2:192.168.122.102:6804/3260542897,v1:192.168.122.102:6805/3260542897] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.xijran{-1:14532} state up:standby seq 1 addr [v2:192.168.122.100:6806/2856291086,v1:192.168.122.100:6807/2856291086] compat {c=[1],r=[1],i=[1fff]}]
Dec 01 09:50:56 compute-2 ceph-mon[76053]: pgmap v32: 198 pgs: 2 unknown, 196 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 4.7 KiB/s rd, 1.2 KiB/s wr, 6 op/s
Dec 01 09:50:56 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/4125115031' entity='client.rgw.rgw.compute-0.mxrshg' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Dec 01 09:50:56 compute-2 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-2.ugomkp' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Dec 01 09:50:56 compute-2 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-1.alkudt' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Dec 01 09:50:56 compute-2 ceph-mon[76053]: osdmap e55: 3 total, 3 up, 3 in
Dec 01 09:50:56 compute-2 ceph-mon[76053]: mds.? [v2:192.168.122.102:6804/3260542897,v1:192.168.122.102:6805/3260542897] up:active
Dec 01 09:50:56 compute-2 ceph-mon[76053]: fsmap cephfs:1 {0=cephfs.compute-2.yoegjc=up:active}
Dec 01 09:50:56 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/4125115031' entity='client.rgw.rgw.compute-0.mxrshg' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec 01 09:50:56 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/603108535' entity='client.rgw.rgw.compute-1.alkudt' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec 01 09:50:56 compute-2 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-1.alkudt' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec 01 09:50:56 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/4186493149' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec 01 09:50:56 compute-2 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec 01 09:50:56 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/167581482' entity='client.admin' cmd=[{"prefix": "versions", "format": "json"}]: dispatch
Dec 01 09:50:56 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:56 compute-2 ceph-mon[76053]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Dec 01 09:50:56 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:56 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:56 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.ijlzoi", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Dec 01 09:50:56 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.ijlzoi", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Dec 01 09:50:56 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:50:56 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).mds e7 new map
Dec 01 09:50:56 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).mds e7 print_map
                                           e7
                                           btime 2025-12-01T09:50:56:889938+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-12-01T09:50:20.704523+0000
                                           modified        2025-12-01T09:50:55.377737+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24223}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 24223 members: 24223
                                           [mds.cephfs.compute-2.yoegjc{0:24223} state up:active seq 2 addr [v2:192.168.122.102:6804/3260542897,v1:192.168.122.102:6805/3260542897] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.xijran{-1:14532} state up:standby seq 1 addr [v2:192.168.122.100:6806/2856291086,v1:192.168.122.100:6807/2856291086] compat {c=[1],r=[1],i=[1fff]}]
Dec 01 09:50:57 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e56 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:50:57 compute-2 radosgw[82855]: v1 topic migration: starting v1 topic migration..
Dec 01 09:50:57 compute-2 radosgw[82855]: LDAP not started since no server URIs were provided in the configuration.
Dec 01 09:50:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-rgw-rgw-compute-2-ugomkp[82851]: 2025-12-01T09:50:57.084+0000 7f24da77f980 -1 LDAP not started since no server URIs were provided in the configuration.
Dec 01 09:50:57 compute-2 radosgw[82855]: v1 topic migration: finished v1 topic migration
Dec 01 09:50:57 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Dec 01 09:50:57 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Dec 01 09:50:57 compute-2 radosgw[82855]: framework: beast
Dec 01 09:50:57 compute-2 radosgw[82855]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Dec 01 09:50:57 compute-2 radosgw[82855]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Dec 01 09:50:57 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Dec 01 09:50:57 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Dec 01 09:50:57 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Dec 01 09:50:57 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Dec 01 09:50:57 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Dec 01 09:50:57 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000013 to be held by another RGW process; skipping for now
Dec 01 09:50:57 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000015 to be held by another RGW process; skipping for now
Dec 01 09:50:57 compute-2 radosgw[82855]: starting handler: beast
Dec 01 09:50:57 compute-2 radosgw[82855]: set uid:gid to 167:167 (ceph:ceph)
Dec 01 09:50:57 compute-2 radosgw[82855]: mgrc service_daemon_register rgw.24214 metadata {arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.102:8082,frontend_type#0=beast,hostname=compute-2,id=rgw.compute-2.ugomkp,kernel_description=#1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025,kernel_version=5.14.0-642.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864316,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=a4b474d3-e1dd-44c2-9911-e36e5f368ef5,zone_name=default,zonegroup_id=079816e3-d8ce-476e-bcdd-2df39ad7439e,zonegroup_name=default}
Dec 01 09:50:58 compute-2 ceph-mon[76053]: Deploying daemon mds.cephfs.compute-1.ijlzoi on compute-1
Dec 01 09:50:58 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/4125115031' entity='client.rgw.rgw.compute-0.mxrshg' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Dec 01 09:50:58 compute-2 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-1.alkudt' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Dec 01 09:50:58 compute-2 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-2.ugomkp' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Dec 01 09:50:58 compute-2 ceph-mon[76053]: osdmap e56: 3 total, 3 up, 3 in
Dec 01 09:50:58 compute-2 ceph-mon[76053]: mds.? [v2:192.168.122.100:6806/2856291086,v1:192.168.122.100:6807/2856291086] up:boot
Dec 01 09:50:58 compute-2 ceph-mon[76053]: fsmap cephfs:1 {0=cephfs.compute-2.yoegjc=up:active} 1 up:standby
Dec 01 09:50:58 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-0.xijran"}]: dispatch
Dec 01 09:50:58 compute-2 ceph-mon[76053]: fsmap cephfs:1 {0=cephfs.compute-2.yoegjc=up:active} 1 up:standby
Dec 01 09:50:59 compute-2 ceph-mon[76053]: pgmap v35: 198 pgs: 198 active+clean; 453 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 2.6 KiB/s rd, 3.8 KiB/s wr, 15 op/s
Dec 01 09:50:59 compute-2 ceph-mon[76053]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Dec 01 09:50:59 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:59 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:59 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:59 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:59 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:59 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:50:59 compute-2 ceph-mon[76053]: Creating key for client.nfs.cephfs.0.0.compute-1.osfnzc
Dec 01 09:50:59 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.osfnzc", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Dec 01 09:50:59 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.osfnzc", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Dec 01 09:50:59 compute-2 ceph-mon[76053]: Ensuring nfs.cephfs.0 is in the ganesha grace table
Dec 01 09:50:59 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Dec 01 09:50:59 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Dec 01 09:50:59 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:50:59 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Dec 01 09:50:59 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Dec 01 09:50:59 compute-2 ceph-mon[76053]: Rados config object exists: conf-nfs.cephfs
Dec 01 09:50:59 compute-2 ceph-mon[76053]: Creating key for client.nfs.cephfs.0.0.compute-1.osfnzc-rgw
Dec 01 09:50:59 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.osfnzc-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec 01 09:50:59 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.osfnzc-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec 01 09:50:59 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:50:59 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).mds e8 new map
Dec 01 09:50:59 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).mds e8 print_map
                                           e8
                                           btime 2025-12-01T09:50:59:122025+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-12-01T09:50:20.704523+0000
                                           modified        2025-12-01T09:50:55.377737+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24223}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 24223 members: 24223
                                           [mds.cephfs.compute-2.yoegjc{0:24223} state up:active seq 2 addr [v2:192.168.122.102:6804/3260542897,v1:192.168.122.102:6805/3260542897] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.xijran{-1:14532} state up:standby seq 1 addr [v2:192.168.122.100:6806/2856291086,v1:192.168.122.100:6807/2856291086] compat {c=[1],r=[1],i=[1fff]}]
                                           [mds.cephfs.compute-1.ijlzoi{-1:24176} state up:standby seq 1 addr [v2:192.168.122.101:6804/1552678510,v1:192.168.122.101:6805/1552678510] compat {c=[1],r=[1],i=[1fff]}]
Dec 01 09:50:59 compute-2 ceph-mds[83711]: mds.pinger is_rank_lagging: rank=0 was never sent ping request.
Dec 01 09:50:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mds-cephfs-compute-2-yoegjc[83707]: 2025-12-01T09:50:59.378+0000 7f0d97300640 -1 mds.pinger is_rank_lagging: rank=0 was never sent ping request.
Dec 01 09:51:00 compute-2 ceph-mon[76053]: Bind address in nfs.cephfs.0.0.compute-1.osfnzc's ganesha conf is defaulting to empty
Dec 01 09:51:00 compute-2 ceph-mon[76053]: Deploying daemon nfs.cephfs.0.0.compute-1.osfnzc on compute-1
Dec 01 09:51:00 compute-2 ceph-mon[76053]: mds.? [v2:192.168.122.101:6804/1552678510,v1:192.168.122.101:6805/1552678510] up:boot
Dec 01 09:51:00 compute-2 ceph-mon[76053]: fsmap cephfs:1 {0=cephfs.compute-2.yoegjc=up:active} 2 up:standby
Dec 01 09:51:00 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-1.ijlzoi"}]: dispatch
Dec 01 09:51:00 compute-2 ceph-mon[76053]: pgmap v36: 198 pgs: 198 active+clean; 453 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 2.7 KiB/s wr, 10 op/s
Dec 01 09:51:00 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:01 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).mds e9 new map
Dec 01 09:51:01 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).mds e9 print_map
                                           e9
                                           btime 2025-12-01T09:51:01:191346+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-12-01T09:50:20.704523+0000
                                           modified        2025-12-01T09:50:55.377737+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24223}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 24223 members: 24223
                                           [mds.cephfs.compute-2.yoegjc{0:24223} state up:active seq 2 addr [v2:192.168.122.102:6804/3260542897,v1:192.168.122.102:6805/3260542897] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.xijran{-1:14532} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/2856291086,v1:192.168.122.100:6807/2856291086] compat {c=[1],r=[1],i=[1fff]}]
                                           [mds.cephfs.compute-1.ijlzoi{-1:24176} state up:standby seq 1 addr [v2:192.168.122.101:6804/1552678510,v1:192.168.122.101:6805/1552678510] compat {c=[1],r=[1],i=[1fff]}]
Dec 01 09:51:01 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).mds e10 new map
Dec 01 09:51:01 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).mds e10 print_map
                                           e10
                                           btime 2025-12-01T09:51:01.219485+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        10
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-12-01T09:50:20.704523+0000
                                           modified        2025-12-01T09:51:01.219484+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        57
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=14532}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 0 members: 
                                           [mds.cephfs.compute-0.xijran{0:14532} state up:replay seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/2856291086,v1:192.168.122.100:6807/2856291086] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-1.ijlzoi{-1:24176} state up:standby seq 1 addr [v2:192.168.122.101:6804/1552678510,v1:192.168.122.101:6805/1552678510] compat {c=[1],r=[1],i=[1fff]}]
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Updating MDS map to version 10 from mon.1
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Map removed me [mds.cephfs.compute-2.yoegjc{0:24223} state up:active seq 2 addr [v2:192.168.122.102:6804/3260542897,v1:192.168.122.102:6805/3260542897] compat {c=[1],r=[1],i=[1fff]}] from cluster; respawning! See cluster/monitor logs for details.
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc respawn!
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command assert hook 0x55992861ad00
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command abort hook 0x55992861ad00
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command leak_some_memory hook 0x55992861ad00
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command perfcounters_dump hook 0x55992861ad00
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command 1 hook 0x55992861ad00
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command perf dump hook 0x55992861ad00
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command perfcounters_schema hook 0x55992861ad00
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command perf histogram dump hook 0x55992861ad00
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command 2 hook 0x55992861ad00
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command perf schema hook 0x55992861ad00
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command counter dump hook 0x55992861ad00
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command counter schema hook 0x55992861ad00
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command perf histogram schema hook 0x55992861ad00
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command perf reset hook 0x55992861ad00
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command config show hook 0x55992861ad00
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command config help hook 0x55992861ad00
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command config set hook 0x55992861ad00
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command config unset hook 0x55992861ad00
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command config get hook 0x55992861ad00
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command config diff hook 0x55992861ad00
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command config diff get hook 0x55992861ad00
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command injectargs hook 0x55992861ad00
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command log flush hook 0x55992861ad00
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command log dump hook 0x55992861ad00
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command log reopen hook 0x55992861ad00
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command dump_mempools hook 0x5599293cc068
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: get_monmap_and_config
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: build_initial_monmap
Dec 01 09:51:01 compute-2 ceph-mds[83711]: build_initial for_mkfs: 0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: monmap:
                                           epoch 0
                                           fsid 365f19c2-81e5-5edd-b6b4-280555214d3a
                                           last_changed 2025-12-01T09:50:54.208546+0000
                                           created 2025-12-01T09:50:54.208546+0000
                                           min_mon_release 0 (unknown)
                                           election_strategy: 1
                                           0: [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon.noname-a
                                           1: [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon.noname-b
                                           2: [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon.noname-c
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding auth protocol: cephx
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding auth protocol: cephx
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding auth protocol: cephx
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding auth protocol: none
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: secure
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: crc
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: secure
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: crc
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: secure
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: crc
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: crc
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: secure
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: crc
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: secure
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: crc
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: secure
Dec 01 09:51:01 compute-2 ceph-mds[83711]: auth: KeyRing::load: loaded key file /var/lib/ceph/mds/ceph-cephfs.compute-2.yoegjc/keyring
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: init
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding auth protocol: cephx
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding auth protocol: cephx
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding auth protocol: cephx
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding auth protocol: none
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding con mode: secure
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding con mode: crc
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding con mode: secure
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding con mode: crc
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding con mode: secure
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding con mode: crc
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding con mode: crc
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding con mode: secure
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding con mode: crc
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding con mode: secure
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding con mode: crc
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding con mode: secure
Dec 01 09:51:01 compute-2 ceph-mds[83711]: auth: KeyRing::load: loaded key file /var/lib/ceph/mds/ceph-cephfs.compute-2.yoegjc/keyring
Dec 01 09:51:01 compute-2 ceph-mds[83711]: auth: KeyRing::load: loaded key file /var/lib/ceph/mds/ceph-cephfs.compute-2.yoegjc/keyring
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command rotate-key hook 0x7ffd0a4baa98
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _reopen_session rank -1
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _add_conns ranks=[1,2,0]
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): picked mon.noname-b con 0x559928614800 addr [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0]
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): picked mon.noname-c con 0x559928615000 addr [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0]
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): picked mon.noname-a con 0x559928615400 addr [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0]
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): start opening mon connection
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): start opening mon connection
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): start opening mon connection
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): _renew_subs
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): authenticate will time out at 2816.095512s
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): get_auth_request con 0x559928615000 auth_method 0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): _init_auth method 2
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): _init_auth creating new auth
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): handle_auth_reply_more payload 9
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): handle_auth_reply_more payload_len 9
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): handle_auth_reply_more responding with 36 bytes
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): handle_auth_done global_id 24220 payload 306
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _finish_hunting 0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: found mon.noname-c
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.noname-c at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_monmap mon_map magic: 0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient:  got monmap 3 from mon.noname-c (according to old e3)
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: dump:
                                           epoch 3
                                           fsid 365f19c2-81e5-5edd-b6b4-280555214d3a
                                           last_changed 2025-12-01T09:49:23.596118+0000
                                           created 2025-12-01T09:46:48.019470+0000
                                           min_mon_release 19 (squid)
                                           election_strategy: 1
                                           0: [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon.compute-0
                                           1: [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon.compute-2
                                           2: [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon.compute-1
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _finish_auth 0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _check_auth_tickets
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _check_auth_rotating renewing rotating keys (they expired before 2025-12-01T09:50:24.214685+0000)
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_config config(26 keys)
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_monmap mon_map magic: 0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient:  got monmap 3 from mon.compute-2 (according to old e3)
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: dump:
                                           epoch 3
                                           fsid 365f19c2-81e5-5edd-b6b4-280555214d3a
                                           last_changed 2025-12-01T09:49:23.596118+0000
                                           created 2025-12-01T09:46:48.019470+0000
                                           min_mon_release 19 (squid)
                                           election_strategy: 1
                                           0: [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon.compute-0
                                           1: [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon.compute-2
                                           2: [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon.compute-1
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: authenticate success, global_id 24220
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: get_monmap_and_config success
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set_mon_vals no callback set
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set_mon_vals cluster_network = 172.20.0.0/24
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set_mon_vals container_image = quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set_mon_vals log_to_file = true
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set_mon_vals mon_cluster_log_to_file = true
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set_mon_vals ms_bind_ipv4 = true
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set_mon_vals ms_bind_ipv6 = false
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set_mon_vals osd_pool_default_size = 1
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set_mon_vals public_network = 192.168.122.0/24
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set_mon_vals rgw_keystone_accepted_admin_roles = ResellerAdmin, swiftoperator
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set_mon_vals rgw_keystone_accepted_roles = member, Member, admin
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set_mon_vals rgw_keystone_admin_domain = default
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set_mon_vals rgw_keystone_admin_password = 12345678
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set_mon_vals rgw_keystone_admin_project = service
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set_mon_vals rgw_keystone_admin_user = swift
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set_mon_vals rgw_keystone_api_version = 3
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set_mon_vals rgw_keystone_implicit_tenants = true
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set_mon_vals rgw_keystone_url = https://keystone-internal.openstack.svc:5000
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set_mon_vals rgw_keystone_verify_ssl = false
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set_mon_vals rgw_max_attr_name_len = 128
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set_mon_vals rgw_max_attr_size = 1024
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set_mon_vals rgw_max_attrs_num_in_req = 90
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set_mon_vals rgw_s3_auth_use_keystone = true
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set_mon_vals rgw_swift_account_in_url = true
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set_mon_vals rgw_swift_enforce_content_length = true
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set_mon_vals rgw_swift_versioning_enabled = true
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set_mon_vals rgw_trust_forwarded_https = true
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _finish_auth 0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _check_auth_tickets
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:50:24.215094+0000)
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: shutdown
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) unregister_commands rotate-key
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set uid:gid to 167:167 (ceph:ceph)
Dec 01 09:51:01 compute-2 ceph-mds[83711]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mds, pid 2
Dec 01 09:51:01 compute-2 ceph-mds[83711]: main not setting numa affinity
Dec 01 09:51:01 compute-2 ceph-mds[83711]: pidfile_write: ignore empty --pid-file
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) init /var/run/ceph/ceph-mds.cephfs.compute-2.yoegjc.asok
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) bind_and_listen /var/run/ceph/ceph-mds.cephfs.compute-2.yoegjc.asok
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command 0 hook 0x5599286717f0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command version hook 0x5599286717f0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command git_version hook 0x5599286717f0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command help hook 0x55992861aca0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command get_command_descriptions hook 0x55992861acb0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command raise hook 0x55992867ef90
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding auth protocol: cephx
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding auth protocol: cephx
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding auth protocol: cephx
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding auth protocol: none
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: secure
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: crc
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: secure
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: crc
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: secure
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: crc
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: crc
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: secure
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: crc
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: secure
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: crc
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: secure
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) entry start
Dec 01 09:51:01 compute-2 ceph-mds[83711]: auth: KeyRing::load: loaded key file /var/lib/ceph/mds/ceph-cephfs.compute-2.yoegjc/keyring
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: build_initial_monmap
Dec 01 09:51:01 compute-2 ceph-mds[83711]: build_initial for_mkfs: 0
Dec 01 09:51:01 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e57 e57: 3 total, 3 up, 3 in
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: monmap:
                                           epoch 0
                                           fsid 00000000-0000-0000-0000-000000000000
                                           last_changed 0.000000
                                           created 0.000000
                                           min_mon_release 0 (unknown)
                                           election_strategy: 1
                                           0: [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon.noname-a
                                           1: [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon.noname-b
                                           2: [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon.noname-c
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: init
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding auth protocol: cephx
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding auth protocol: cephx
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding auth protocol: cephx
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding auth protocol: none
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding con mode: secure
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding con mode: crc
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding con mode: secure
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding con mode: crc
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding con mode: secure
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding con mode: crc
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding con mode: crc
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding con mode: secure
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding con mode: crc
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding con mode: secure
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding con mode: crc
Dec 01 09:51:01 compute-2 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding con mode: secure
Dec 01 09:51:01 compute-2 ceph-mds[83711]: auth: KeyRing::load: loaded key file /var/lib/ceph/mds/ceph-cephfs.compute-2.yoegjc/keyring
Dec 01 09:51:01 compute-2 ceph-mds[83711]: auth: KeyRing::load: loaded key file /var/lib/ceph/mds/ceph-cephfs.compute-2.yoegjc/keyring
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command rotate-key hook 0x7ffd0a4bbe18
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _reopen_session rank -1
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _add_conns ranks=[1,2,0]
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): picked mon.noname-b con 0x559928615800 addr [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0]
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): picked mon.noname-c con 0x559928615400 addr [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0]
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): picked mon.noname-a con 0x559928614800 addr [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0]
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): start opening mon connection
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): start opening mon connection
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): start opening mon connection
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): _renew_subs
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): get_auth_request con 0x559928615400 auth_method 0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): _init_auth method 2
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): _init_auth creating new auth
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): handle_auth_reply_more payload 9
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): handle_auth_reply_more payload_len 9
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): handle_auth_reply_more responding with 36 bytes
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): get_auth_request con 0x559928615800 auth_method 0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): _init_auth method 2
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): _init_auth creating new auth
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient(hunting): handle_auth_done global_id 24223 payload 1139
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _finish_hunting 0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: found mon.noname-c
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.noname-c at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: get_auth_request con 0x559928614800 auth_method 0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_monmap mon_map magic: 0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient:  got monmap 3 from mon.noname-c (according to old e3)
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: dump:
                                           epoch 3
                                           fsid 365f19c2-81e5-5edd-b6b4-280555214d3a
                                           last_changed 2025-12-01T09:49:23.596118+0000
                                           created 2025-12-01T09:46:48.019470+0000
                                           min_mon_release 19 (squid)
                                           election_strategy: 1
                                           0: [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon.compute-0
                                           1: [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon.compute-2
                                           2: [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon.compute-1
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _finish_auth 0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _check_auth_tickets
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _check_auth_rotating renewing rotating keys (they expired before 2025-12-01T09:50:24.222410+0000)
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_config config(26 keys)
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_monmap mon_map magic: 0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient:  got monmap 3 from mon.compute-2 (according to old e3)
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: dump:
                                           epoch 3
                                           fsid 365f19c2-81e5-5edd-b6b4-280555214d3a
                                           last_changed 2025-12-01T09:49:23.596118+0000
                                           created 2025-12-01T09:46:48.019470+0000
                                           min_mon_release 19 (squid)
                                           election_strategy: 1
                                           0: [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon.compute-0
                                           1: [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon.compute-2
                                           2: [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon.compute-1
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: authenticate success, global_id 24223
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set_mon_vals no callback set
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _finish_auth 0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _check_auth_tickets
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:50:24.222760+0000)
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: wait_auth_rotating waiting for 30
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: wait_auth_rotating done
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _renew_subs
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command status hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command lockup hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command dump_ops_in_flight hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command ops hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command op kill hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command op get hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command dump_blocked_ops hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command dump_blocked_ops_count hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command dump_historic_ops hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command dump_historic_ops_by_duration hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command dump_export_states hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command scrub_path hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command scrub start hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command scrub abort hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command scrub pause hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command scrub resume hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command scrub status hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command tag path hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command flush_path hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command export dir hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command dump cache hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command cache drop hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command lock path hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command cache status hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command quiesce path hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command dump tree hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command dump loads hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command dump snaps hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command session ls hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command client ls hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command session evict hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command client evict hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command session kill hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command session config hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command client config hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command damage ls hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command damage rm hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command osdmap barrier hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command flush journal hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command force_readonly hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command get subtrees hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command dirfrag split hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command dirfrag merge hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command dirfrag ls hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command openfiles ls hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command dump inode hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command dump dir hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command exit hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command respawn hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command quiesce db hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command heap hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command cpu_profiler hook 0x55992861bcc0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Updating MDS map to version 2 from mon.1
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.beacon.cephfs.compute-2.yoegjc Sending beacon up:boot seq 1
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Updating MDS map to version 3 from mon.1
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _renew_subs
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Monitors have assigned me to become a standby
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.beacon.cephfs.compute-2.yoegjc set_want_state: up:boot -> up:standby
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.beacon.cephfs.compute-2.yoegjc received beacon reply up:boot seq 1 rtt 0.145004
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mgrc handle_mgr_map Got map version 27
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mgrc handle_mgr_map Active mgr is now [v2:192.168.122.100:6800/1316147242,v1:192.168.122.100:6801/1316147242]
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/1316147242,v1:192.168.122.100:6801/1316147242]
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: get_auth_request con 0x559928788800 auth_method 0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mgrc handle_mgr_configure stats_period=5
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mgrc handle_mgr_configure updated stats threshold: 5
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Updating MDS map to version 4 from mon.1
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.purge_queue operator():  data pool 7 not found in OSDMap
Dec 01 09:51:01 compute-2 ceph-mds[83711]: asok(0x559928688000) register_command objecter_requests hook 0x55992861be90
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _renew_subs
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: log_channel(cluster) update_config to_monitors: true to_syslog: false syslog_facility:  prio: info to_graylog: false graylog_host: 127.0.0.1 graylog_port: 12201)
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.purge_queue operator():  data pool 7 not found in OSDMap
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.0 apply_blocklist: killed 0, blocklisted sessions (0 blocklist entries, 0)
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.4 handle_mds_map I am now mds.0.4
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.4 handle_mds_map state change up:standby --> up:creating
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.beacon.cephfs.compute-2.yoegjc set_want_state: up:standby -> up:creating
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.4 boot_create
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.log create empty log
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.journaler.mdlog(ro) set_writeable
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.journaler.mdlog(rw) created blank journal at inode 0x0x200, format=1
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.4 boot_create creating fresh hierarchy
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x1
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.log _submit_thread 4194304~28 : ELid(1)
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.4 boot_create creating mydir hierarchy
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x100
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x600
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x601
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x602
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x603
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x604
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x605
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x606
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x607
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x608
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x609
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.4 boot_create creating global snaprealm
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.purge_queue create: creating
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.journaler.pq(ro) set_writeable
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.journaler.pq(rw) created blank journal at inode 0x0x500, format=1
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.log _submit_thread 4194352~872 : ESubtreeMap 2 subtrees , 0 ambiguous [metablob 0x1, 2 dirs]
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: quiesce.mds.0 <quiesce_cluster_update> epoch:4 me:24223 leader:0 members:
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.cache Memory usage:  total 261068, rss 38112, heap 198940, baseline 198940, 0 / 13 inodes have caps, 0 caps, 0 caps per inode
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: get_auth_request con 0x5599296c7800 auth_method 0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.4 apply_blocklist: killed 0, blocklisted sessions (0 blocklist entries, 0)
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _renew_subs
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 1 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 2 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 3 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 4 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 5 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 6 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 7 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 8 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 9 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 10 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 11 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 12 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 13 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 14 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 15 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 16 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 17 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 18 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 19 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 20 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 21 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 22 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 23 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 24 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 25 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 26 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 27 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 28 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 29 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 30 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 31 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 32 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: get_auth_request con 0x5599296bdc00 auth_method 0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: get_auth_request con 0x55992961a400 auth_method 0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 33 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 34 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 35 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 36 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_get_version_reply finishing 37 version 54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.4 creating_done
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.4 request_state up:active
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.beacon.cephfs.compute-2.yoegjc set_want_state: up:creating -> up:active
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.beacon.cephfs.compute-2.yoegjc Sending beacon up:active seq 2
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.cache Memory usage:  total 261068, rss 38484, heap 198940, baseline 198940, 0 / 13 inodes have caps, 0 caps, 0 caps per inode
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Updating MDS map to version 5 from mon.1
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.4 handle_mds_map I am now mds.0.4
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.4 handle_mds_map state change up:creating --> up:active
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.4 recovery_done -- successful recovery!
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.4 active_start
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.4 set_osd_epoch_barrier: epoch=54
Dec 01 09:51:01 compute-2 ceph-mds[83711]: quiesce.mds.0 <quiesce_cluster_update> epoch:5 me:24223 leader:24223 members:24223
Dec 01 09:51:01 compute-2 ceph-mds[83711]: quiesce.mgr.0 <update_membership> starting the db mgr thread at epoch: 5
Dec 01 09:51:01 compute-2 ceph-mds[83711]: quiesce.mgr.0 <quiesce_db_thread_main> Entering the main thread
Dec 01 09:51:01 compute-2 ceph-mds[83711]: quiesce.mgr.0 <membership_upkeep> a reset of the db has been requested
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.4 apply_blocklist: killed 0, blocklisted sessions (0 blocklist entries, 0)
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _renew_subs
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.beacon.cephfs.compute-2.yoegjc received beacon reply up:active seq 2 rtt 1.21003
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.cache Memory usage:  total 293852, rss 38980, heap 231708, baseline 198940, 0 / 13 inodes have caps, 0 caps, 0 caps per inode
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.cache trim bytes_used=52kB limit=4GB reservation=0.05% count=0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.4 apply_blocklist: killed 0, blocklisted sessions (0 blocklist entries, 0)
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _renew_subs
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: tick
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _check_auth_tickets
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:50:27.220073+0000)
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.cache Memory usage:  total 293852, rss 38992, heap 231708, baseline 198940, 0 / 13 inodes have caps, 0 caps, 0 caps per inode
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.cache trim bytes_used=52kB limit=4GB reservation=0.05% count=0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: tick
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _check_auth_tickets
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:50:28.220249+0000)
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.cache Memory usage:  total 293852, rss 38996, heap 231708, baseline 198940, 0 / 13 inodes have caps, 0 caps, 0 caps per inode
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.cache trim bytes_used=52kB limit=4GB reservation=0.05% count=0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.beacon.cephfs.compute-2.yoegjc Sending beacon up:active seq 3
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.beacon.cephfs.compute-2.yoegjc received beacon reply up:active seq 3 rtt 0.00100002
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: handle_config config(27 keys)
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set_mon_vals no callback set
Dec 01 09:51:01 compute-2 ceph-mds[83711]: set_mon_vals mds_join_fs = cephfs
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: tick
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _check_auth_tickets
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:50:29.220413+0000)
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.pinger is_rank_lagging: rank=0 was never sent ping request.
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.cache Memory usage:  total 293852, rss 39228, heap 231708, baseline 198940, 0 / 13 inodes have caps, 0 caps, 0 caps per inode
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.cache trim bytes_used=52kB limit=4GB reservation=0.05% count=0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: tick
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _check_auth_tickets
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:50:30.220574+0000)
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.cache Memory usage:  total 293852, rss 39236, heap 231708, baseline 198940, 0 / 13 inodes have caps, 0 caps, 0 caps per inode
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.0.cache trim bytes_used=52kB limit=4GB reservation=0.05% count=0
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: tick
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _check_auth_tickets
Dec 01 09:51:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mds-cephfs-compute-2-yoegjc[83707]:    -13> 2025-12-01T09:50:59.378+0000 7f0d97300640 -1 mds.pinger is_rank_lagging: rank=0 was never sent ping request.
Dec 01 09:51:01 compute-2 ceph-mds[83711]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:50:31.220855+0000)
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Updating MDS map to version 10 from mon.1
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Map removed me [mds.cephfs.compute-2.yoegjc{0:24223} state up:active seq 2 addr [v2:192.168.122.102:6804/3260542897,v1:192.168.122.102:6805/3260542897] compat {c=[1],r=[1],i=[1fff]}] from cluster; respawning! See cluster/monitor logs for details.
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc respawn!
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc  e: '/usr/bin/ceph-mds'
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc  0: '/usr/bin/ceph-mds'
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc  1: '-n'
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc  2: 'mds.cephfs.compute-2.yoegjc'
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc  3: '-f'
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc  4: '--setuser'
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc  5: 'ceph'
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc  6: '--setgroup'
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc  7: 'ceph'
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc  8: '--default-log-to-file=false'
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc  9: '--default-log-to-journald=true'
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc  10: '--default-log-to-stderr=false'
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc respawning with exe /usr/bin/ceph-mds
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc  exe_path /proc/self/exe
Dec 01 09:51:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mds-cephfs-compute-2-yoegjc[83707]: ignoring --setuser ceph since I am not root
Dec 01 09:51:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mds-cephfs-compute-2-yoegjc[83707]: ignoring --setgroup ceph since I am not root
Dec 01 09:51:01 compute-2 ceph-mds[83711]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mds, pid 2
Dec 01 09:51:01 compute-2 ceph-mds[83711]: main not setting numa affinity
Dec 01 09:51:01 compute-2 ceph-mds[83711]: pidfile_write: ignore empty --pid-file
Dec 01 09:51:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mds-cephfs-compute-2-yoegjc[83707]: starting mds.cephfs.compute-2.yoegjc at 
Dec 01 09:51:01 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Updating MDS map to version 10 from mon.1
Dec 01 09:51:02 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e57 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:51:02 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:02 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:02 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:02 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.ymqwfj", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Dec 01 09:51:02 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.ymqwfj", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Dec 01 09:51:02 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Dec 01 09:51:02 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Dec 01 09:51:02 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:51:02 compute-2 ceph-mon[76053]: mds.? [v2:192.168.122.100:6806/2856291086,v1:192.168.122.100:6807/2856291086] up:standby
Dec 01 09:51:02 compute-2 ceph-mon[76053]: Dropping low affinity active daemon mds.cephfs.compute-2.yoegjc in favor of higher affinity standby.
Dec 01 09:51:02 compute-2 ceph-mon[76053]: Replacing daemon mds.cephfs.compute-2.yoegjc as rank 0 with standby daemon mds.cephfs.compute-0.xijran
Dec 01 09:51:02 compute-2 ceph-mon[76053]: Health check failed: 1 filesystem is degraded (FS_DEGRADED)
Dec 01 09:51:02 compute-2 ceph-mon[76053]: fsmap cephfs:1 {0=cephfs.compute-2.yoegjc=up:active} 2 up:standby
Dec 01 09:51:02 compute-2 ceph-mon[76053]: osdmap e57: 3 total, 3 up, 3 in
Dec 01 09:51:02 compute-2 ceph-mon[76053]: fsmap cephfs:1/1 {0=cephfs.compute-0.xijran=up:replay} 1 up:standby
Dec 01 09:51:02 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).mds e11 new map
Dec 01 09:51:02 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).mds e11 print_map
                                           e11
                                           btime 2025-12-01T09:51:02:233755+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        11
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-12-01T09:50:20.704523+0000
                                           modified        2025-12-01T09:51:01.259526+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        57
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=14532}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 0 members: 
                                           [mds.cephfs.compute-0.xijran{0:14532} state up:reconnect seq 3 join_fscid=1 addr [v2:192.168.122.100:6806/2856291086,v1:192.168.122.100:6807/2856291086] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-1.ijlzoi{-1:24176} state up:standby seq 1 addr [v2:192.168.122.101:6804/1552678510,v1:192.168.122.101:6805/1552678510] compat {c=[1],r=[1],i=[1fff]}]
                                           [mds.cephfs.compute-2.yoegjc{-1:24241} state up:standby seq 1 join_fscid=1 addr [v2:192.168.122.102:6804/3537925606,v1:192.168.122.102:6805/3537925606] compat {c=[1],r=[1],i=[1fff]}]
Dec 01 09:51:02 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Updating MDS map to version 11 from mon.1
Dec 01 09:51:02 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Monitors have assigned me to become a standby
Dec 01 09:51:03 compute-2 ceph-mon[76053]: Creating key for client.nfs.cephfs.1.0.compute-2.ymqwfj
Dec 01 09:51:03 compute-2 ceph-mon[76053]: Ensuring nfs.cephfs.1 is in the ganesha grace table
Dec 01 09:51:03 compute-2 ceph-mon[76053]: pgmap v38: 198 pgs: 198 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 212 KiB/s rd, 9.2 KiB/s wr, 399 op/s
Dec 01 09:51:03 compute-2 ceph-mon[76053]: mds.? [v2:192.168.122.100:6806/2856291086,v1:192.168.122.100:6807/2856291086] up:reconnect
Dec 01 09:51:03 compute-2 ceph-mon[76053]: mds.? [v2:192.168.122.102:6804/3537925606,v1:192.168.122.102:6805/3537925606] up:boot
Dec 01 09:51:03 compute-2 ceph-mon[76053]: fsmap cephfs:1/1 {0=cephfs.compute-0.xijran=up:reconnect} 2 up:standby
Dec 01 09:51:03 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-2.yoegjc"}]: dispatch
Dec 01 09:51:03 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).mds e12 new map
Dec 01 09:51:03 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).mds e12 print_map
                                           e12
                                           btime 2025-12-01T09:51:03:341186+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        12
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-12-01T09:50:20.704523+0000
                                           modified        2025-12-01T09:51:02.346773+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        57
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=14532}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 0 members: 
                                           [mds.cephfs.compute-0.xijran{0:14532} state up:rejoin seq 4 join_fscid=1 addr [v2:192.168.122.100:6806/2856291086,v1:192.168.122.100:6807/2856291086] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-1.ijlzoi{-1:24176} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.101:6804/1552678510,v1:192.168.122.101:6805/1552678510] compat {c=[1],r=[1],i=[1fff]}]
                                           [mds.cephfs.compute-2.yoegjc{-1:24241} state up:standby seq 1 join_fscid=1 addr [v2:192.168.122.102:6804/3537925606,v1:192.168.122.102:6805/3537925606] compat {c=[1],r=[1],i=[1fff]}]
Dec 01 09:51:04 compute-2 sudo[83797]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:51:04 compute-2 sudo[83797]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:51:04 compute-2 sudo[83797]: pam_unix(sudo:session): session closed for user root
Dec 01 09:51:04 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).mds e13 new map
Dec 01 09:51:04 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).mds e13 print_map
                                           e13
                                           btime 2025-12-01T09:51:04:350567+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        13
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-12-01T09:50:20.704523+0000
                                           modified        2025-12-01T09:51:04.350563+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        57
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=14532}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 14532 members: 14532
                                           [mds.cephfs.compute-0.xijran{0:14532} state up:active seq 5 join_fscid=1 addr [v2:192.168.122.100:6806/2856291086,v1:192.168.122.100:6807/2856291086] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-1.ijlzoi{-1:24176} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.101:6804/1552678510,v1:192.168.122.101:6805/1552678510] compat {c=[1],r=[1],i=[1fff]}]
                                           [mds.cephfs.compute-2.yoegjc{-1:24241} state up:standby seq 1 join_fscid=1 addr [v2:192.168.122.102:6804/3537925606,v1:192.168.122.102:6805/3537925606] compat {c=[1],r=[1],i=[1fff]}]
Dec 01 09:51:04 compute-2 ceph-mon[76053]: mds.? [v2:192.168.122.100:6806/2856291086,v1:192.168.122.100:6807/2856291086] up:rejoin
Dec 01 09:51:04 compute-2 ceph-mon[76053]: mds.? [v2:192.168.122.101:6804/1552678510,v1:192.168.122.101:6805/1552678510] up:standby
Dec 01 09:51:04 compute-2 ceph-mon[76053]: fsmap cephfs:1/1 {0=cephfs.compute-0.xijran=up:rejoin} 2 up:standby
Dec 01 09:51:04 compute-2 ceph-mon[76053]: daemon mds.cephfs.compute-0.xijran is now active in filesystem cephfs as rank 0
Dec 01 09:51:04 compute-2 ceph-mon[76053]: pgmap v39: 198 pgs: 198 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 160 KiB/s rd, 6.9 KiB/s wr, 301 op/s
Dec 01 09:51:04 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Dec 01 09:51:04 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Dec 01 09:51:04 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.ymqwfj-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec 01 09:51:04 compute-2 sudo[83822]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:51:04 compute-2 sudo[83822]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:51:05 compute-2 podman[83886]: 2025-12-01 09:51:05.257664172 +0000 UTC m=+0.053038806 container create cff1d5300d266affa7720307d96136e5c00aa094b59a61324039c0822beb1f90 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_lewin, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:51:05 compute-2 systemd[1]: Started libpod-conmon-cff1d5300d266affa7720307d96136e5c00aa094b59a61324039c0822beb1f90.scope.
Dec 01 09:51:05 compute-2 podman[83886]: 2025-12-01 09:51:05.237041764 +0000 UTC m=+0.032416428 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:51:05 compute-2 systemd[1]: Started libcrun container.
Dec 01 09:51:05 compute-2 podman[83886]: 2025-12-01 09:51:05.356193275 +0000 UTC m=+0.151567929 container init cff1d5300d266affa7720307d96136e5c00aa094b59a61324039c0822beb1f90 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_lewin, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec 01 09:51:05 compute-2 podman[83886]: 2025-12-01 09:51:05.363861788 +0000 UTC m=+0.159236422 container start cff1d5300d266affa7720307d96136e5c00aa094b59a61324039c0822beb1f90 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_lewin, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, OSD_FLAVOR=default)
Dec 01 09:51:05 compute-2 podman[83886]: 2025-12-01 09:51:05.367344997 +0000 UTC m=+0.162719651 container attach cff1d5300d266affa7720307d96136e5c00aa094b59a61324039c0822beb1f90 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_lewin, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:51:05 compute-2 modest_lewin[83902]: 167 167
Dec 01 09:51:05 compute-2 systemd[1]: libpod-cff1d5300d266affa7720307d96136e5c00aa094b59a61324039c0822beb1f90.scope: Deactivated successfully.
Dec 01 09:51:05 compute-2 podman[83886]: 2025-12-01 09:51:05.373216414 +0000 UTC m=+0.168591068 container died cff1d5300d266affa7720307d96136e5c00aa094b59a61324039c0822beb1f90 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_lewin, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:51:05 compute-2 systemd[1]: var-lib-containers-storage-overlay-bbbbdb3abffee8d5f0d3c0c0597d84e0db8e69ffc4a6bb1062226459a7f7c1f8-merged.mount: Deactivated successfully.
Dec 01 09:51:05 compute-2 podman[83886]: 2025-12-01 09:51:05.409321064 +0000 UTC m=+0.204695698 container remove cff1d5300d266affa7720307d96136e5c00aa094b59a61324039c0822beb1f90 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_lewin, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1)
Dec 01 09:51:05 compute-2 systemd[1]: libpod-conmon-cff1d5300d266affa7720307d96136e5c00aa094b59a61324039c0822beb1f90.scope: Deactivated successfully.
Dec 01 09:51:05 compute-2 systemd[1]: Reloading.
Dec 01 09:51:05 compute-2 systemd-rc-local-generator[83941]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:51:05 compute-2 systemd-sysv-generator[83944]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:51:05 compute-2 ceph-mon[76053]: Rados config object exists: conf-nfs.cephfs
Dec 01 09:51:05 compute-2 ceph-mon[76053]: Creating key for client.nfs.cephfs.1.0.compute-2.ymqwfj-rgw
Dec 01 09:51:05 compute-2 ceph-mon[76053]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded)
Dec 01 09:51:05 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.ymqwfj-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec 01 09:51:05 compute-2 ceph-mon[76053]: Bind address in nfs.cephfs.1.0.compute-2.ymqwfj's ganesha conf is defaulting to empty
Dec 01 09:51:05 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:51:05 compute-2 ceph-mon[76053]: Deploying daemon nfs.cephfs.1.0.compute-2.ymqwfj on compute-2
Dec 01 09:51:05 compute-2 ceph-mon[76053]: mds.? [v2:192.168.122.100:6806/2856291086,v1:192.168.122.100:6807/2856291086] up:active
Dec 01 09:51:05 compute-2 ceph-mon[76053]: fsmap cephfs:1 {0=cephfs.compute-0.xijran=up:active} 2 up:standby
Dec 01 09:51:05 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:05 compute-2 systemd[1]: Reloading.
Dec 01 09:51:05 compute-2 systemd-rc-local-generator[83983]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:51:05 compute-2 systemd-sysv-generator[83989]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:51:06 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec 01 09:51:06 compute-2 podman[84041]: 2025-12-01 09:51:06.303892613 +0000 UTC m=+0.044184985 container create 7dc3478aba756353fd4c95c60545e4383e561bd972d5b694631e57cc304acfc8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2)
Dec 01 09:51:06 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9262ddf7fc944390bf835eb5e50a7d780785627595a79a8a8c408446a409e3eb/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 01 09:51:06 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9262ddf7fc944390bf835eb5e50a7d780785627595a79a8a8c408446a409e3eb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:51:06 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9262ddf7fc944390bf835eb5e50a7d780785627595a79a8a8c408446a409e3eb/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:51:06 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9262ddf7fc944390bf835eb5e50a7d780785627595a79a8a8c408446a409e3eb/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:51:06 compute-2 podman[84041]: 2025-12-01 09:51:06.282779641 +0000 UTC m=+0.023072033 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:51:06 compute-2 podman[84041]: 2025-12-01 09:51:06.39748502 +0000 UTC m=+0.137777412 container init 7dc3478aba756353fd4c95c60545e4383e561bd972d5b694631e57cc304acfc8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec 01 09:51:06 compute-2 podman[84041]: 2025-12-01 09:51:06.402692441 +0000 UTC m=+0.142984823 container start 7dc3478aba756353fd4c95c60545e4383e561bd972d5b694631e57cc304acfc8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:51:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 01 09:51:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 01 09:51:06 compute-2 bash[84041]: 7dc3478aba756353fd4c95c60545e4383e561bd972d5b694631e57cc304acfc8
Dec 01 09:51:06 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 09:51:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 01 09:51:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 01 09:51:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 01 09:51:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 01 09:51:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 01 09:51:06 compute-2 sudo[83822]: pam_unix(sudo:session): session closed for user root
Dec 01 09:51:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 09:51:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Dec 01 09:51:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Dec 01 09:51:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 09:51:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 09:51:06 compute-2 ceph-mon[76053]: pgmap v40: 198 pgs: 198 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 145 KiB/s rd, 4.5 KiB/s wr, 268 op/s
Dec 01 09:51:06 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:06 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:06 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:06 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.pytvsu", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Dec 01 09:51:06 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.pytvsu", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Dec 01 09:51:06 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Dec 01 09:51:06 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Dec 01 09:51:06 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:51:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e57 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:51:07 compute-2 ceph-mon[76053]: Creating key for client.nfs.cephfs.2.0.compute-0.pytvsu
Dec 01 09:51:07 compute-2 ceph-mon[76053]: Ensuring nfs.cephfs.2 is in the ganesha grace table
Dec 01 09:51:08 compute-2 ceph-mon[76053]: pgmap v41: 198 pgs: 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 134 KiB/s rd, 4.1 KiB/s wr, 241 op/s
Dec 01 09:51:09 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Dec 01 09:51:09 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Dec 01 09:51:10 compute-2 ceph-mon[76053]: pgmap v42: 198 pgs: 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 134 KiB/s rd, 4.1 KiB/s wr, 241 op/s
Dec 01 09:51:10 compute-2 ceph-mon[76053]: Rados config object exists: conf-nfs.cephfs
Dec 01 09:51:10 compute-2 ceph-mon[76053]: Creating key for client.nfs.cephfs.2.0.compute-0.pytvsu-rgw
Dec 01 09:51:10 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.pytvsu-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec 01 09:51:10 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.pytvsu-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec 01 09:51:10 compute-2 ceph-mon[76053]: Bind address in nfs.cephfs.2.0.compute-0.pytvsu's ganesha conf is defaulting to empty
Dec 01 09:51:10 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:51:10 compute-2 ceph-mon[76053]: Deploying daemon nfs.cephfs.2.0.compute-0.pytvsu on compute-0
Dec 01 09:51:12 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e57 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:51:12 compute-2 ceph-mon[76053]: pgmap v43: 198 pgs: 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 9.9 KiB/s rd, 1.2 KiB/s wr, 11 op/s
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000002:nfs.cephfs.1: -2
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 09:51:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 09:51:13 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:13 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:13 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:13 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:13 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:13 compute-2 ceph-mon[76053]: Deploying daemon haproxy.nfs.cephfs.compute-1.pwynis on compute-1
Dec 01 09:51:14 compute-2 ceph-mon[76053]: pgmap v44: 198 pgs: 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 8.5 KiB/s rd, 1023 B/s wr, 9 op/s
Dec 01 09:51:16 compute-2 ceph-mon[76053]: pgmap v45: 198 pgs: 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 8.5 KiB/s rd, 1023 B/s wr, 9 op/s
Dec 01 09:51:16 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:17 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e57 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:51:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:17 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:18 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:18 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:18 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:19 compute-2 ceph-mon[76053]: Deploying daemon haproxy.nfs.cephfs.compute-0.alcixd on compute-0
Dec 01 09:51:19 compute-2 ceph-mon[76053]: pgmap v46: 198 pgs: 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 12 KiB/s rd, 2.0 KiB/s wr, 13 op/s
Dec 01 09:51:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:19 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c0016e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:20 compute-2 ceph-mon[76053]: pgmap v47: 198 pgs: 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 4.6 KiB/s rd, 1.8 KiB/s wr, 6 op/s
Dec 01 09:51:20 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]: dispatch
Dec 01 09:51:20 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e58 e58: 3 total, 3 up, 3 in
Dec 01 09:51:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:21 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97fc000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:21 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]': finished
Dec 01 09:51:21 compute-2 ceph-mon[76053]: osdmap e58: 3 total, 3 up, 3 in
Dec 01 09:51:21 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Dec 01 09:51:21 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:51:22 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e58 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:51:23 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e59 e59: 3 total, 3 up, 3 in
Dec 01 09:51:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:23 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:23 compute-2 ceph-mon[76053]: pgmap v49: 198 pgs: 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 3.7 KiB/s rd, 1.2 KiB/s wr, 5 op/s
Dec 01 09:51:24 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e60 e60: 3 total, 3 up, 3 in
Dec 01 09:51:25 compute-2 sshd-session[84114]: Invalid user proxyuser from 45.78.219.119 port 45110
Dec 01 09:51:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:25 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814001230 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:25 compute-2 sshd-session[84114]: Received disconnect from 45.78.219.119 port 45110:11: Bye Bye [preauth]
Dec 01 09:51:25 compute-2 sshd-session[84114]: Disconnected from invalid user proxyuser 45.78.219.119 port 45110 [preauth]
Dec 01 09:51:26 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Dec 01 09:51:26 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]': finished
Dec 01 09:51:26 compute-2 ceph-mon[76053]: pgmap v50: 198 pgs: 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 3.7 KiB/s rd, 1.2 KiB/s wr, 5 op/s
Dec 01 09:51:26 compute-2 ceph-mon[76053]: osdmap e59: 3 total, 3 up, 3 in
Dec 01 09:51:26 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:51:26 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Dec 01 09:51:26 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]': finished
Dec 01 09:51:26 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Dec 01 09:51:26 compute-2 ceph-mon[76053]: osdmap e60: 3 total, 3 up, 3 in
Dec 01 09:51:26 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e61 e61: 3 total, 3 up, 3 in
Dec 01 09:51:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:26 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c0021e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:27 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e61 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:51:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:27 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97fc0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:27 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e62 e62: 3 total, 3 up, 3 in
Dec 01 09:51:27 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Dec 01 09:51:27 compute-2 ceph-mon[76053]: pgmap v53: 229 pgs: 31 unknown, 198 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:51:27 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:51:27 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:51:27 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Dec 01 09:51:27 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Dec 01 09:51:27 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Dec 01 09:51:27 compute-2 ceph-mon[76053]: osdmap e61: 3 total, 3 up, 3 in
Dec 01 09:51:27 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Dec 01 09:51:27 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:28 compute-2 sudo[84116]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:51:28 compute-2 sudo[84116]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:51:28 compute-2 sudo[84116]: pam_unix(sudo:session): session closed for user root
Dec 01 09:51:28 compute-2 sudo[84141]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/haproxy:2.3 --timeout 895 _orch deploy --fsid 365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:51:28 compute-2 sudo[84141]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:51:28 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e63 e63: 3 total, 3 up, 3 in
Dec 01 09:51:28 compute-2 ceph-mon[76053]: 8.16 scrub starts
Dec 01 09:51:28 compute-2 ceph-mon[76053]: 8.16 scrub ok
Dec 01 09:51:28 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:28 compute-2 ceph-mon[76053]: pgmap v55: 291 pgs: 62 unknown, 32 peering, 197 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:51:28 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:51:28 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Dec 01 09:51:28 compute-2 ceph-mon[76053]: osdmap e62: 3 total, 3 up, 3 in
Dec 01 09:51:28 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:28 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97fc0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:29 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f40016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:30 compute-2 ceph-mon[76053]: Deploying daemon haproxy.nfs.cephfs.compute-2.bdogrt on compute-2
Dec 01 09:51:30 compute-2 ceph-mon[76053]: 8.10 scrub starts
Dec 01 09:51:30 compute-2 ceph-mon[76053]: 8.10 scrub ok
Dec 01 09:51:30 compute-2 ceph-mon[76053]: 10.12 scrub starts
Dec 01 09:51:30 compute-2 ceph-mon[76053]: 10.12 scrub ok
Dec 01 09:51:30 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Dec 01 09:51:30 compute-2 ceph-mon[76053]: osdmap e63: 3 total, 3 up, 3 in
Dec 01 09:51:30 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 01 09:51:30 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e64 e64: 3 total, 3 up, 3 in
Dec 01 09:51:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:30 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c0021e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:31 compute-2 ceph-mon[76053]: 8.17 scrub starts
Dec 01 09:51:31 compute-2 ceph-mon[76053]: 8.17 scrub ok
Dec 01 09:51:31 compute-2 ceph-mon[76053]: pgmap v58: 322 pgs: 93 unknown, 32 peering, 197 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:51:31 compute-2 ceph-mon[76053]: 10.7 scrub starts
Dec 01 09:51:31 compute-2 ceph-mon[76053]: 10.7 scrub ok
Dec 01 09:51:31 compute-2 ceph-mon[76053]: 8.14 scrub starts
Dec 01 09:51:31 compute-2 ceph-mon[76053]: 8.14 scrub ok
Dec 01 09:51:31 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Dec 01 09:51:31 compute-2 ceph-mon[76053]: osdmap e64: 3 total, 3 up, 3 in
Dec 01 09:51:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:31 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97fc0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:32 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e64 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:51:32 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e65 e65: 3 total, 3 up, 3 in
Dec 01 09:51:32 compute-2 podman[84203]: 2025-12-01 09:51:32.584835517 +0000 UTC m=+4.129886191 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Dec 01 09:51:32 compute-2 podman[84203]: 2025-12-01 09:51:32.60379046 +0000 UTC m=+4.148841114 container create dbc2b31ef0ab44dc4c0323e2611e740698dd4e3c0b56bff869b4aeb6e69f0ddf (image=quay.io/ceph/haproxy:2.3, name=vigorous_chatelet)
Dec 01 09:51:32 compute-2 systemd[1]: Started libpod-conmon-dbc2b31ef0ab44dc4c0323e2611e740698dd4e3c0b56bff869b4aeb6e69f0ddf.scope.
Dec 01 09:51:32 compute-2 systemd[1]: Started libcrun container.
Dec 01 09:51:32 compute-2 podman[84203]: 2025-12-01 09:51:32.689962088 +0000 UTC m=+4.235012762 container init dbc2b31ef0ab44dc4c0323e2611e740698dd4e3c0b56bff869b4aeb6e69f0ddf (image=quay.io/ceph/haproxy:2.3, name=vigorous_chatelet)
Dec 01 09:51:32 compute-2 podman[84203]: 2025-12-01 09:51:32.701515473 +0000 UTC m=+4.246566117 container start dbc2b31ef0ab44dc4c0323e2611e740698dd4e3c0b56bff869b4aeb6e69f0ddf (image=quay.io/ceph/haproxy:2.3, name=vigorous_chatelet)
Dec 01 09:51:32 compute-2 podman[84203]: 2025-12-01 09:51:32.70456458 +0000 UTC m=+4.249615244 container attach dbc2b31ef0ab44dc4c0323e2611e740698dd4e3c0b56bff869b4aeb6e69f0ddf (image=quay.io/ceph/haproxy:2.3, name=vigorous_chatelet)
Dec 01 09:51:32 compute-2 vigorous_chatelet[84321]: 0 0
Dec 01 09:51:32 compute-2 systemd[1]: libpod-dbc2b31ef0ab44dc4c0323e2611e740698dd4e3c0b56bff869b4aeb6e69f0ddf.scope: Deactivated successfully.
Dec 01 09:51:32 compute-2 conmon[84321]: conmon dbc2b31ef0ab44dc4c03 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-dbc2b31ef0ab44dc4c0323e2611e740698dd4e3c0b56bff869b4aeb6e69f0ddf.scope/container/memory.events
Dec 01 09:51:32 compute-2 podman[84203]: 2025-12-01 09:51:32.707585668 +0000 UTC m=+4.252636322 container died dbc2b31ef0ab44dc4c0323e2611e740698dd4e3c0b56bff869b4aeb6e69f0ddf (image=quay.io/ceph/haproxy:2.3, name=vigorous_chatelet)
Dec 01 09:51:32 compute-2 systemd[1]: var-lib-containers-storage-overlay-96417fa2b9729c270b5b6736d224517de84432ad631c8c193a0906d54fa27e27-merged.mount: Deactivated successfully.
Dec 01 09:51:32 compute-2 podman[84203]: 2025-12-01 09:51:32.74958459 +0000 UTC m=+4.294635234 container remove dbc2b31ef0ab44dc4c0323e2611e740698dd4e3c0b56bff869b4aeb6e69f0ddf (image=quay.io/ceph/haproxy:2.3, name=vigorous_chatelet)
Dec 01 09:51:32 compute-2 systemd[1]: libpod-conmon-dbc2b31ef0ab44dc4c0323e2611e740698dd4e3c0b56bff869b4aeb6e69f0ddf.scope: Deactivated successfully.
Dec 01 09:51:32 compute-2 systemd[1]: Reloading.
Dec 01 09:51:32 compute-2 systemd-sysv-generator[84374]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:51:32 compute-2 systemd-rc-local-generator[84370]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:51:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:32 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97fc0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:33 compute-2 systemd[1]: Reloading.
Dec 01 09:51:33 compute-2 systemd-rc-local-generator[84410]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:51:33 compute-2 systemd-sysv-generator[84413]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:51:33 compute-2 ceph-mon[76053]: 10.1f deep-scrub starts
Dec 01 09:51:33 compute-2 ceph-mon[76053]: 10.1f deep-scrub ok
Dec 01 09:51:33 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:33 compute-2 ceph-mon[76053]: 8.3 scrub starts
Dec 01 09:51:33 compute-2 ceph-mon[76053]: 8.3 scrub ok
Dec 01 09:51:33 compute-2 ceph-mon[76053]: pgmap v60: 353 pgs: 1 peering, 31 unknown, 321 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:51:33 compute-2 systemd[1]: Starting Ceph haproxy.nfs.cephfs.compute-2.bdogrt for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec 01 09:51:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:33 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4001fc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:33 compute-2 podman[84465]: 2025-12-01 09:51:33.549197557 +0000 UTC m=+0.040028523 container create 5c83b48e4050b43a9798275fb3960cbbe72d63867925b25374b9306e495d3a3c (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt)
Dec 01 09:51:33 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad200115cc2dd4dba0c4cd6d416202803433215a5e4e32fd0769fa3c1029afe9/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Dec 01 09:51:33 compute-2 podman[84465]: 2025-12-01 09:51:33.603017589 +0000 UTC m=+0.093848585 container init 5c83b48e4050b43a9798275fb3960cbbe72d63867925b25374b9306e495d3a3c (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt)
Dec 01 09:51:33 compute-2 podman[84465]: 2025-12-01 09:51:33.607926375 +0000 UTC m=+0.098757341 container start 5c83b48e4050b43a9798275fb3960cbbe72d63867925b25374b9306e495d3a3c (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt)
Dec 01 09:51:33 compute-2 bash[84465]: 5c83b48e4050b43a9798275fb3960cbbe72d63867925b25374b9306e495d3a3c
Dec 01 09:51:33 compute-2 podman[84465]: 2025-12-01 09:51:33.532931872 +0000 UTC m=+0.023762858 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Dec 01 09:51:33 compute-2 systemd[1]: Started Ceph haproxy.nfs.cephfs.compute-2.bdogrt for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 09:51:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [NOTICE] 334/095133 (2) : New worker #1 (4) forked
Dec 01 09:51:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095133 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 09:51:33 compute-2 sudo[84141]: pam_unix(sudo:session): session closed for user root
Dec 01 09:51:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:34 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c0021e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:34 compute-2 ceph-mon[76053]: 10.1d scrub starts
Dec 01 09:51:34 compute-2 ceph-mon[76053]: 10.1d scrub ok
Dec 01 09:51:34 compute-2 ceph-mon[76053]: 8.2 scrub starts
Dec 01 09:51:34 compute-2 ceph-mon[76053]: 8.2 scrub ok
Dec 01 09:51:34 compute-2 ceph-mon[76053]: osdmap e65: 3 total, 3 up, 3 in
Dec 01 09:51:34 compute-2 ceph-mon[76053]: 10.1c scrub starts
Dec 01 09:51:34 compute-2 ceph-mon[76053]: 10.1c scrub ok
Dec 01 09:51:34 compute-2 ceph-mon[76053]: 8.15 scrub starts
Dec 01 09:51:34 compute-2 ceph-mon[76053]: 8.15 scrub ok
Dec 01 09:51:34 compute-2 ceph-mon[76053]: pgmap v62: 353 pgs: 1 peering, 31 unknown, 321 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:51:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:34 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97fc0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:35 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814001ef0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:35 compute-2 ceph-mon[76053]: 10.19 scrub starts
Dec 01 09:51:35 compute-2 ceph-mon[76053]: 10.19 scrub ok
Dec 01 09:51:35 compute-2 ceph-mon[76053]: 8.f scrub starts
Dec 01 09:51:35 compute-2 ceph-mon[76053]: 8.f scrub ok
Dec 01 09:51:35 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:35 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:35 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:35 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:36 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4001fc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:36 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c0021e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:37 compute-2 ceph-mon[76053]: 10.1a scrub starts
Dec 01 09:51:37 compute-2 ceph-mon[76053]: 10.1a scrub ok
Dec 01 09:51:37 compute-2 ceph-mon[76053]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Dec 01 09:51:37 compute-2 ceph-mon[76053]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Dec 01 09:51:37 compute-2 ceph-mon[76053]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Dec 01 09:51:37 compute-2 ceph-mon[76053]: Deploying daemon keepalived.nfs.cephfs.compute-1.wzwqmm on compute-1
Dec 01 09:51:37 compute-2 ceph-mon[76053]: 8.11 scrub starts
Dec 01 09:51:37 compute-2 ceph-mon[76053]: 8.11 scrub ok
Dec 01 09:51:37 compute-2 ceph-mon[76053]: pgmap v63: 353 pgs: 1 peering, 31 unknown, 321 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:51:37 compute-2 ceph-mon[76053]: 10.1e scrub starts
Dec 01 09:51:37 compute-2 ceph-mon[76053]: 10.1e scrub ok
Dec 01 09:51:37 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e65 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:51:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:37 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814001ef0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:38 : epoch 692d650a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 09:51:38 compute-2 ceph-mon[76053]: 8.e scrub starts
Dec 01 09:51:38 compute-2 ceph-mon[76053]: 8.e scrub ok
Dec 01 09:51:38 compute-2 ceph-mon[76053]: 10.18 scrub starts
Dec 01 09:51:38 compute-2 ceph-mon[76053]: 10.18 scrub ok
Dec 01 09:51:38 compute-2 ceph-mon[76053]: 8.d scrub starts
Dec 01 09:51:38 compute-2 ceph-mon[76053]: 8.d scrub ok
Dec 01 09:51:38 compute-2 ceph-mon[76053]: pgmap v64: 353 pgs: 353 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:51:38 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 01 09:51:38 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 01 09:51:38 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 01 09:51:38 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Dec 01 09:51:38 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 01 09:51:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:38 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97fc0032f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:38 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e66 e66: 3 total, 3 up, 3 in
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[11.16( empty local-lis/les=0/0 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[8.15( empty local-lis/les=0/0 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[8.16( empty local-lis/les=0/0 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[9.16( empty local-lis/les=0/0 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[11.13( empty local-lis/les=0/0 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[9.17( empty local-lis/les=0/0 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[8.11( empty local-lis/les=0/0 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[9.3( empty local-lis/les=0/0 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[8.2( empty local-lis/les=0/0 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[8.3( empty local-lis/les=0/0 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[8.f( empty local-lis/les=0/0 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[9.9( empty local-lis/les=0/0 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[11.a( empty local-lis/les=0/0 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[8.9( empty local-lis/les=0/0 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[9.8( empty local-lis/les=0/0 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[9.b( empty local-lis/les=0/0 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[8.a( empty local-lis/les=0/0 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[11.e( empty local-lis/les=0/0 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[8.d( empty local-lis/les=0/0 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[8.c( empty local-lis/les=0/0 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[11.8( empty local-lis/les=0/0 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[8.b( empty local-lis/les=0/0 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[11.3( empty local-lis/les=0/0 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[11.17( empty local-lis/les=0/0 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[9.7( empty local-lis/les=0/0 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[8.6( empty local-lis/les=0/0 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[9.5( empty local-lis/les=0/0 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[8.5( empty local-lis/les=0/0 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[11.19( empty local-lis/les=0/0 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[9.18( empty local-lis/les=0/0 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[8.1f( empty local-lis/les=0/0 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[8.1c( empty local-lis/les=0/0 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[9.1d( empty local-lis/les=0/0 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[9.13( empty local-lis/les=0/0 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.17( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[12.11( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.15( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.13( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[12.4( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.1( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[12.13( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[12.7( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.f( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[12.9( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.9( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.d( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.b( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.3( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[12.3( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.5( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[12.2( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[12.1e( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.19( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.1d( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[12.1a( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[12.18( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.1f( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[12.17( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.7( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.11( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.1b( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[12.1d( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:38 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4001fc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:39 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e67 e67: 3 total, 3 up, 3 in
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.1f( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.1f( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.1d( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.1d( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.1b( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.5( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.1b( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.19( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.19( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.7( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.7( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.9( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.9( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.f( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.f( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.17( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.17( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.5( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.1( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.1( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.13( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.13( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.d( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.d( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.11( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.11( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.3( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.3( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.b( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.b( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.15( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.15( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[11.17( v 53'48 (0'0,53'48] local-lis/les=66/67 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[9.9( v 49'6 (0'0,49'6] local-lis/les=66/67 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[9.13( v 49'6 (0'0,49'6] local-lis/les=66/67 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[12.1a( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[8.1f( v 57'44 (0'0,57'44] local-lis/les=66/67 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[12.1d( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[9.18( v 49'6 (0'0,49'6] local-lis/les=66/67 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[8.c( v 57'44 (0'0,57'44] local-lis/les=66/67 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[11.19( v 53'48 (0'0,53'48] local-lis/les=66/67 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[12.3( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[12.1e( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[8.5( v 57'44 (0'0,57'44] local-lis/les=66/67 n=1 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[12.2( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[9.7( v 49'6 (0'0,49'6] local-lis/les=66/67 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[8.b( v 57'44 (0'0,57'44] local-lis/les=66/67 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[8.6( v 65'45 lc 57'43 (0'0,65'45] local-lis/les=66/67 n=1 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=65'45 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[12.9( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[11.e( v 64'51 lc 53'27 (0'0,64'51] local-lis/les=66/67 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=64'51 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[12.11( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[8.15( v 57'44 (0'0,57'44] local-lis/les=66/67 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[11.8( v 53'48 (0'0,53'48] local-lis/les=66/67 n=1 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[11.16( v 53'48 (0'0,53'48] local-lis/les=66/67 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[8.d( v 57'44 lc 57'18 (0'0,57'44] local-lis/les=66/67 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[11.13( v 53'48 (0'0,53'48] local-lis/les=66/67 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[12.7( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[8.3( v 57'44 (0'0,57'44] local-lis/les=66/67 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[8.11( v 57'44 (0'0,57'44] local-lis/les=66/67 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[8.f( v 57'44 (0'0,57'44] local-lis/les=66/67 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[9.b( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=66/67 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=49'6 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[12.4( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[8.a( v 65'45 lc 0'0 (0'0,65'45] local-lis/les=66/67 n=1 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=65'45 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[9.1d( v 49'6 (0'0,49'6] local-lis/les=66/67 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[12.17( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[8.9( v 57'44 (0'0,57'44] local-lis/les=66/67 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[11.a( v 53'48 (0'0,53'48] local-lis/les=66/67 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[9.8( v 49'6 (0'0,49'6] local-lis/les=66/67 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[8.1c( v 57'44 (0'0,57'44] local-lis/les=66/67 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[9.5( v 49'6 (0'0,49'6] local-lis/les=66/67 n=1 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[9.17( v 49'6 (0'0,49'6] local-lis/les=66/67 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[9.3( v 49'6 (0'0,49'6] local-lis/les=66/67 n=1 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[12.18( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[8.2( v 57'44 (0'0,57'44] local-lis/les=66/67 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[8.16( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=66/67 n=2 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=57'44 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[12.13( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[9.16( v 49'6 (0'0,49'6] local-lis/les=66/67 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[11.3( v 64'51 lc 53'40 (0'0,64'51] local-lis/les=66/67 n=1 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=64'51 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:39 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c0021e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:39 compute-2 ceph-mon[76053]: 10.4 scrub starts
Dec 01 09:51:39 compute-2 ceph-mon[76053]: 10.4 scrub ok
Dec 01 09:51:39 compute-2 ceph-mon[76053]: 8.8 scrub starts
Dec 01 09:51:39 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 01 09:51:39 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 01 09:51:39 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 01 09:51:39 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Dec 01 09:51:39 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 01 09:51:39 compute-2 ceph-mon[76053]: osdmap e66: 3 total, 3 up, 3 in
Dec 01 09:51:39 compute-2 ceph-mon[76053]: 8.8 scrub ok
Dec 01 09:51:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:40 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814002df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:40 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e68 e68: 3 total, 3 up, 3 in
Dec 01 09:51:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:40 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97fc0032f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:40 compute-2 ceph-mon[76053]: 10.16 scrub starts
Dec 01 09:51:40 compute-2 ceph-mon[76053]: 10.16 scrub ok
Dec 01 09:51:40 compute-2 ceph-mon[76053]: 9.14 scrub starts
Dec 01 09:51:40 compute-2 ceph-mon[76053]: 9.14 scrub ok
Dec 01 09:51:40 compute-2 ceph-mon[76053]: osdmap e67: 3 total, 3 up, 3 in
Dec 01 09:51:40 compute-2 ceph-mon[76053]: pgmap v67: 353 pgs: 353 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:51:40 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Dec 01 09:51:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:41 : epoch 692d650a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 09:51:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:41 : epoch 692d650a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 09:51:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:41 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f40032f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:41 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e69 e69: 3 total, 3 up, 3 in
Dec 01 09:51:41 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.1b( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:41 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.1b( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:41 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.19( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=7 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:41 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.9( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:41 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.17( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:41 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.5( v 68'1022 (0'0,68'1022] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 luod=0'0 crt=62'1018 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:41 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.17( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:41 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.5( v 68'1022 (0'0,68'1022] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=62'1018 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:41 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.f( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:41 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.1( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:41 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.1( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:41 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.f( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:41 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.13( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:41 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.13( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:41 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.19( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=7 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:41 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.9( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:41 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.11( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:41 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.11( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:41 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.3( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:41 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.3( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:42 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c0021e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:42 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e69 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:51:42 compute-2 ceph-mon[76053]: 11.15 scrub starts
Dec 01 09:51:42 compute-2 ceph-mon[76053]: 9.6 scrub starts
Dec 01 09:51:42 compute-2 ceph-mon[76053]: 9.6 scrub ok
Dec 01 09:51:42 compute-2 ceph-mon[76053]: 11.15 scrub ok
Dec 01 09:51:42 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Dec 01 09:51:42 compute-2 ceph-mon[76053]: osdmap e68: 3 total, 3 up, 3 in
Dec 01 09:51:42 compute-2 ceph-mon[76053]: 9.2 scrub starts
Dec 01 09:51:42 compute-2 ceph-mon[76053]: 9.2 scrub ok
Dec 01 09:51:42 compute-2 ceph-mon[76053]: pgmap v69: 353 pgs: 1 active+clean+scrubbing, 16 remapped+peering, 336 active+clean; 456 KiB data, 121 MiB used, 60 GiB / 60 GiB avail; 15 B/s, 0 objects/s recovering
Dec 01 09:51:42 compute-2 ceph-mon[76053]: osdmap e69: 3 total, 3 up, 3 in
Dec 01 09:51:42 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e70 e70: 3 total, 3 up, 3 in
Dec 01 09:51:42 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.15( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:42 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:42 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:42 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.15( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:42 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.1d( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:42 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.b( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:42 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.b( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:42 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.7( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:42 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.7( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:42 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.1d( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:42 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.d( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:42 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.d( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:42 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.13( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:42 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.f( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:42 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.1( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:42 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.9( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:42 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.3( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:42 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.17( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:42 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.1b( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:42 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.19( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=7 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:42 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.5( v 68'1022 (0'0,68'1022] local-lis/les=69/70 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=68'1022 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:42 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.11( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:42 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814002df0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:43 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.13 scrub starts
Dec 01 09:51:43 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.13 scrub ok
Dec 01 09:51:43 compute-2 ceph-mon[76053]: 11.0 deep-scrub starts
Dec 01 09:51:43 compute-2 ceph-mon[76053]: 11.0 deep-scrub ok
Dec 01 09:51:43 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:43 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:43 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:43 compute-2 ceph-mon[76053]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Dec 01 09:51:43 compute-2 ceph-mon[76053]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Dec 01 09:51:43 compute-2 ceph-mon[76053]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Dec 01 09:51:43 compute-2 ceph-mon[76053]: Deploying daemon keepalived.nfs.cephfs.compute-0.gzwexr on compute-0
Dec 01 09:51:43 compute-2 ceph-mon[76053]: osdmap e70: 3 total, 3 up, 3 in
Dec 01 09:51:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:43 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97fc0032f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:44 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.f scrub starts
Dec 01 09:51:44 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.f scrub ok
Dec 01 09:51:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:44 : epoch 692d650a : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 09:51:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:44 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f40032f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:44 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c0021e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:45 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.17 scrub starts
Dec 01 09:51:45 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.17 scrub ok
Dec 01 09:51:45 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e71 e71: 3 total, 3 up, 3 in
Dec 01 09:51:45 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 71 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:45 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 71 pg[10.1d( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:45 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 71 pg[10.7( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:45 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 71 pg[10.d( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:45 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 71 pg[10.15( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:45 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 71 pg[10.b( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:45 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:45 compute-2 sshd-session[84494]: error: kex_exchange_identification: read: Connection timed out
Dec 01 09:51:45 compute-2 sshd-session[84494]: banner exchange: Connection from 14.22.89.30 port 39616: Connection timed out
Dec 01 09:51:45 compute-2 ceph-mon[76053]: 10.13 scrub starts
Dec 01 09:51:45 compute-2 ceph-mon[76053]: 10.13 scrub ok
Dec 01 09:51:45 compute-2 ceph-mon[76053]: 11.c scrub starts
Dec 01 09:51:45 compute-2 ceph-mon[76053]: 11.c scrub ok
Dec 01 09:51:45 compute-2 ceph-mon[76053]: pgmap v72: 353 pgs: 10 peering, 1 active+clean+scrubbing, 6 remapped+peering, 336 active+clean; 456 KiB data, 121 MiB used, 60 GiB / 60 GiB avail; 566 B/s, 17 objects/s recovering
Dec 01 09:51:46 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.9 deep-scrub starts
Dec 01 09:51:46 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.9 deep-scrub ok
Dec 01 09:51:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:46 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97fc003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:46 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:47 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.13 scrub starts
Dec 01 09:51:47 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.13 scrub ok
Dec 01 09:51:47 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e72 e72: 3 total, 3 up, 3 in
Dec 01 09:51:47 compute-2 ceph-mon[76053]: 10.f scrub starts
Dec 01 09:51:47 compute-2 ceph-mon[76053]: 10.f scrub ok
Dec 01 09:51:47 compute-2 ceph-mon[76053]: 11.b scrub starts
Dec 01 09:51:47 compute-2 ceph-mon[76053]: 11.b scrub ok
Dec 01 09:51:47 compute-2 ceph-mon[76053]: 12.15 deep-scrub starts
Dec 01 09:51:47 compute-2 ceph-mon[76053]: 12.15 deep-scrub ok
Dec 01 09:51:47 compute-2 ceph-mon[76053]: 11.17 scrub starts
Dec 01 09:51:47 compute-2 ceph-mon[76053]: 11.17 scrub ok
Dec 01 09:51:47 compute-2 ceph-mon[76053]: osdmap e71: 3 total, 3 up, 3 in
Dec 01 09:51:47 compute-2 ceph-mon[76053]: 11.9 deep-scrub starts
Dec 01 09:51:47 compute-2 ceph-mon[76053]: 11.9 deep-scrub ok
Dec 01 09:51:47 compute-2 ceph-mon[76053]: pgmap v74: 353 pgs: 10 peering, 1 active+clean+scrubbing, 6 remapped+peering, 336 active+clean; 456 KiB data, 121 MiB used, 60 GiB / 60 GiB avail; 216 B/s rd, 216 B/s wr, 0 op/s; 475 B/s, 13 objects/s recovering
Dec 01 09:51:47 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e72 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:51:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:47 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c0021e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:48 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.1a scrub starts
Dec 01 09:51:48 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.1a scrub ok
Dec 01 09:51:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:48 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:48 compute-2 ceph-mon[76053]: 12.f scrub starts
Dec 01 09:51:48 compute-2 ceph-mon[76053]: 12.f scrub ok
Dec 01 09:51:48 compute-2 ceph-mon[76053]: 9.9 deep-scrub starts
Dec 01 09:51:48 compute-2 ceph-mon[76053]: 9.9 deep-scrub ok
Dec 01 09:51:48 compute-2 ceph-mon[76053]: 11.d scrub starts
Dec 01 09:51:48 compute-2 ceph-mon[76053]: 11.d scrub ok
Dec 01 09:51:48 compute-2 ceph-mon[76053]: 12.d scrub starts
Dec 01 09:51:48 compute-2 ceph-mon[76053]: 9.13 scrub starts
Dec 01 09:51:48 compute-2 ceph-mon[76053]: 9.13 scrub ok
Dec 01 09:51:48 compute-2 ceph-mon[76053]: 12.d scrub ok
Dec 01 09:51:48 compute-2 ceph-mon[76053]: osdmap e72: 3 total, 3 up, 3 in
Dec 01 09:51:48 compute-2 ceph-mon[76053]: 9.c scrub starts
Dec 01 09:51:48 compute-2 ceph-mon[76053]: 9.c scrub ok
Dec 01 09:51:48 compute-2 ceph-mon[76053]: pgmap v76: 353 pgs: 8 peering, 345 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 528 B/s, 14 objects/s recovering
Dec 01 09:51:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:48 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97fc003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:49 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.1f scrub starts
Dec 01 09:51:49 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.1f scrub ok
Dec 01 09:51:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:49 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095149 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 09:51:49 compute-2 ceph-mon[76053]: 12.5 scrub starts
Dec 01 09:51:49 compute-2 ceph-mon[76053]: 12.5 scrub ok
Dec 01 09:51:49 compute-2 ceph-mon[76053]: 12.1a scrub starts
Dec 01 09:51:49 compute-2 ceph-mon[76053]: 12.1a scrub ok
Dec 01 09:51:49 compute-2 ceph-mon[76053]: 11.2 deep-scrub starts
Dec 01 09:51:49 compute-2 ceph-mon[76053]: 11.2 deep-scrub ok
Dec 01 09:51:50 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.18 scrub starts
Dec 01 09:51:50 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.18 scrub ok
Dec 01 09:51:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:50 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:50 compute-2 ceph-mon[76053]: 12.0 scrub starts
Dec 01 09:51:50 compute-2 ceph-mon[76053]: 12.0 scrub ok
Dec 01 09:51:50 compute-2 ceph-mon[76053]: 8.1f scrub starts
Dec 01 09:51:50 compute-2 ceph-mon[76053]: 8.1f scrub ok
Dec 01 09:51:50 compute-2 ceph-mon[76053]: 9.0 scrub starts
Dec 01 09:51:50 compute-2 ceph-mon[76053]: 9.0 scrub ok
Dec 01 09:51:50 compute-2 ceph-mon[76053]: pgmap v77: 353 pgs: 8 peering, 345 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 115 B/s, 2 objects/s recovering
Dec 01 09:51:50 compute-2 ceph-mon[76053]: 12.1f scrub starts
Dec 01 09:51:50 compute-2 ceph-mon[76053]: 12.1f scrub ok
Dec 01 09:51:50 compute-2 ceph-mon[76053]: 9.18 scrub starts
Dec 01 09:51:50 compute-2 ceph-mon[76053]: 9.18 scrub ok
Dec 01 09:51:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:51 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824000df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:51 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.c deep-scrub starts
Dec 01 09:51:51 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.c deep-scrub ok
Dec 01 09:51:51 compute-2 sudo[84498]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:51:51 compute-2 sudo[84498]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:51:51 compute-2 sudo[84498]: pam_unix(sudo:session): session closed for user root
Dec 01 09:51:51 compute-2 sudo[84523]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/keepalived:2.2.4 --timeout 895 _orch deploy --fsid 365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:51:51 compute-2 sudo[84523]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:51:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:51 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818001bb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:51 compute-2 ceph-mon[76053]: 8.1 deep-scrub starts
Dec 01 09:51:51 compute-2 ceph-mon[76053]: 8.1 deep-scrub ok
Dec 01 09:51:51 compute-2 ceph-mon[76053]: 12.1b deep-scrub starts
Dec 01 09:51:51 compute-2 ceph-mon[76053]: 12.1b deep-scrub ok
Dec 01 09:51:51 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:51 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:51 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:51 compute-2 ceph-mon[76053]: 8.c deep-scrub starts
Dec 01 09:51:51 compute-2 ceph-mon[76053]: 8.c deep-scrub ok
Dec 01 09:51:51 compute-2 ceph-mon[76053]: 9.1 scrub starts
Dec 01 09:51:51 compute-2 ceph-mon[76053]: 9.1 scrub ok
Dec 01 09:51:51 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Dec 01 09:51:52 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.1d scrub starts
Dec 01 09:51:52 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.1d scrub ok
Dec 01 09:51:52 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e73 e73: 3 total, 3 up, 3 in
Dec 01 09:51:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:52 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:52 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e73 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:51:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:52 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f00016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:53 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.2 scrub starts
Dec 01 09:51:53 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.2 scrub ok
Dec 01 09:51:53 compute-2 ceph-mon[76053]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Dec 01 09:51:53 compute-2 ceph-mon[76053]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Dec 01 09:51:53 compute-2 ceph-mon[76053]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Dec 01 09:51:53 compute-2 ceph-mon[76053]: Deploying daemon keepalived.nfs.cephfs.compute-2.vkgipv on compute-2
Dec 01 09:51:53 compute-2 ceph-mon[76053]: pgmap v78: 353 pgs: 353 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 164 B/s, 7 objects/s recovering
Dec 01 09:51:53 compute-2 ceph-mon[76053]: 12.16 scrub starts
Dec 01 09:51:53 compute-2 ceph-mon[76053]: 12.16 scrub ok
Dec 01 09:51:53 compute-2 ceph-mon[76053]: 12.1d scrub starts
Dec 01 09:51:53 compute-2 ceph-mon[76053]: 12.1d scrub ok
Dec 01 09:51:53 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Dec 01 09:51:53 compute-2 ceph-mon[76053]: osdmap e73: 3 total, 3 up, 3 in
Dec 01 09:51:53 compute-2 ceph-mon[76053]: 8.7 scrub starts
Dec 01 09:51:53 compute-2 ceph-mon[76053]: 8.7 scrub ok
Dec 01 09:51:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:53 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f00016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:54 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.1e scrub starts
Dec 01 09:51:54 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.1e scrub ok
Dec 01 09:51:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:54 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f00016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:54 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e74 e74: 3 total, 3 up, 3 in
Dec 01 09:51:54 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 74 pg[10.14( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=74) [2] r=0 lpr=74 pi=[61,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:54 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 74 pg[10.c( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=74) [2] r=0 lpr=74 pi=[61,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:54 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 74 pg[10.4( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=74) [2] r=0 lpr=74 pi=[61,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:54 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 74 pg[10.1c( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=74) [2] r=0 lpr=74 pi=[61,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:54 compute-2 ceph-mon[76053]: 12.14 deep-scrub starts
Dec 01 09:51:54 compute-2 ceph-mon[76053]: 12.14 deep-scrub ok
Dec 01 09:51:54 compute-2 ceph-mon[76053]: 12.2 scrub starts
Dec 01 09:51:54 compute-2 ceph-mon[76053]: 12.2 scrub ok
Dec 01 09:51:54 compute-2 ceph-mon[76053]: 11.6 scrub starts
Dec 01 09:51:54 compute-2 ceph-mon[76053]: 11.6 scrub ok
Dec 01 09:51:54 compute-2 ceph-mon[76053]: pgmap v80: 353 pgs: 353 active+clean; 457 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 164 B/s, 7 objects/s recovering
Dec 01 09:51:54 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Dec 01 09:51:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:54 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818001d50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:55 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.3 scrub starts
Dec 01 09:51:55 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.3 scrub ok
Dec 01 09:51:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:55 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824001930 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:56 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.b scrub starts
Dec 01 09:51:56 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.b scrub ok
Dec 01 09:51:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:56 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824001930 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:56 compute-2 ceph-mon[76053]: 12.1 scrub starts
Dec 01 09:51:56 compute-2 ceph-mon[76053]: 12.1 scrub ok
Dec 01 09:51:56 compute-2 ceph-mon[76053]: 12.1e scrub starts
Dec 01 09:51:56 compute-2 ceph-mon[76053]: 12.1e scrub ok
Dec 01 09:51:56 compute-2 ceph-mon[76053]: 9.4 scrub starts
Dec 01 09:51:56 compute-2 ceph-mon[76053]: 9.4 scrub ok
Dec 01 09:51:56 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Dec 01 09:51:56 compute-2 ceph-mon[76053]: osdmap e74: 3 total, 3 up, 3 in
Dec 01 09:51:56 compute-2 ceph-mon[76053]: 9.15 scrub starts
Dec 01 09:51:56 compute-2 ceph-mon[76053]: 9.15 scrub ok
Dec 01 09:51:56 compute-2 ceph-mon[76053]: 12.3 scrub starts
Dec 01 09:51:56 compute-2 ceph-mon[76053]: 12.3 scrub ok
Dec 01 09:51:56 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Dec 01 09:51:56 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e75 e75: 3 total, 3 up, 3 in
Dec 01 09:51:56 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.1c( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=75) [2]/[1] r=-1 lpr=75 pi=[61,75)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:56 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.1c( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=75) [2]/[1] r=-1 lpr=75 pi=[61,75)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 01 09:51:56 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.5( v 71'1025 (0'0,71'1025] local-lis/les=69/70 n=6 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=75 pruub=9.999804497s) [0] r=-1 lpr=75 pi=[69,75)/1 crt=70'1023 lcod 70'1024 mlcod 70'1024 active pruub 129.364913940s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:56 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.5( v 71'1025 (0'0,71'1025] local-lis/les=69/70 n=6 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=75 pruub=9.999738693s) [0] r=-1 lpr=75 pi=[69,75)/1 crt=70'1023 lcod 70'1024 mlcod 0'0 unknown NOTIFY pruub 129.364913940s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:51:56 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.1d( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=75 pruub=12.364146233s) [0] r=-1 lpr=75 pi=[70,75)/1 crt=56'1015 mlcod 0'0 active pruub 131.729537964s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:56 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.4( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=75) [2]/[1] r=-1 lpr=75 pi=[61,75)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:56 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.1d( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=75 pruub=12.364106178s) [0] r=-1 lpr=75 pi=[70,75)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 131.729537964s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:51:56 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.4( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=75) [2]/[1] r=-1 lpr=75 pi=[61,75)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 01 09:51:56 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.d( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=6 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=75 pruub=12.363791466s) [0] r=-1 lpr=75 pi=[70,75)/1 crt=56'1015 mlcod 0'0 active pruub 131.729568481s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:56 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.d( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=6 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=75 pruub=12.363774300s) [0] r=-1 lpr=75 pi=[70,75)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 131.729568481s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:51:56 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.c( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=75) [2]/[1] r=-1 lpr=75 pi=[61,75)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:56 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.c( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=75) [2]/[1] r=-1 lpr=75 pi=[61,75)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 01 09:51:56 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.14( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=75) [2]/[1] r=-1 lpr=75 pi=[61,75)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:56 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.14( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=75) [2]/[1] r=-1 lpr=75 pi=[61,75)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 01 09:51:56 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.15( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=75 pruub=12.363538742s) [0] r=-1 lpr=75 pi=[70,75)/1 crt=56'1015 mlcod 0'0 active pruub 131.729614258s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:56 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.15( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=75 pruub=12.363523483s) [0] r=-1 lpr=75 pi=[70,75)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 131.729614258s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:51:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:56 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f00016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:57 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.5 deep-scrub starts
Dec 01 09:51:57 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.5 deep-scrub ok
Dec 01 09:51:57 compute-2 podman[84587]: 2025-12-01 09:51:57.195095347 +0000 UTC m=+5.577248911 container create 24d850b4053a41a4b182edfe11c5334345345506a4c4b9f1a7ab59dd0bb8987b (image=quay.io/ceph/keepalived:2.2.4, name=amazing_lederberg, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, distribution-scope=public, release=1793, io.buildah.version=1.28.2, vendor=Red Hat, Inc., name=keepalived, architecture=x86_64)
Dec 01 09:51:57 compute-2 podman[84587]: 2025-12-01 09:51:57.174708796 +0000 UTC m=+5.556862390 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Dec 01 09:51:57 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e75 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:51:57 compute-2 systemd[1]: Started libpod-conmon-24d850b4053a41a4b182edfe11c5334345345506a4c4b9f1a7ab59dd0bb8987b.scope.
Dec 01 09:51:57 compute-2 systemd[1]: Started libcrun container.
Dec 01 09:51:57 compute-2 podman[84587]: 2025-12-01 09:51:57.468227704 +0000 UTC m=+5.850381268 container init 24d850b4053a41a4b182edfe11c5334345345506a4c4b9f1a7ab59dd0bb8987b (image=quay.io/ceph/keepalived:2.2.4, name=amazing_lederberg, name=keepalived, distribution-scope=public, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, vcs-type=git, description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 01 09:51:57 compute-2 podman[84587]: 2025-12-01 09:51:57.47554239 +0000 UTC m=+5.857695954 container start 24d850b4053a41a4b182edfe11c5334345345506a4c4b9f1a7ab59dd0bb8987b (image=quay.io/ceph/keepalived:2.2.4, name=amazing_lederberg, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20, summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, description=keepalived for Ceph, com.redhat.component=keepalived-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, io.buildah.version=1.28.2, vcs-type=git, distribution-scope=public, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=)
Dec 01 09:51:57 compute-2 amazing_lederberg[84683]: 0 0
Dec 01 09:51:57 compute-2 systemd[1]: libpod-24d850b4053a41a4b182edfe11c5334345345506a4c4b9f1a7ab59dd0bb8987b.scope: Deactivated successfully.
Dec 01 09:51:57 compute-2 conmon[84683]: conmon 24d850b4053a41a4b182 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-24d850b4053a41a4b182edfe11c5334345345506a4c4b9f1a7ab59dd0bb8987b.scope/container/memory.events
Dec 01 09:51:57 compute-2 podman[84587]: 2025-12-01 09:51:57.536070464 +0000 UTC m=+5.918224048 container attach 24d850b4053a41a4b182edfe11c5334345345506a4c4b9f1a7ab59dd0bb8987b (image=quay.io/ceph/keepalived:2.2.4, name=amazing_lederberg, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, vcs-type=git, release=1793, io.openshift.expose-services=, com.redhat.component=keepalived-container, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Dec 01 09:51:57 compute-2 podman[84587]: 2025-12-01 09:51:57.537054759 +0000 UTC m=+5.919208323 container died 24d850b4053a41a4b182edfe11c5334345345506a4c4b9f1a7ab59dd0bb8987b (image=quay.io/ceph/keepalived:2.2.4, name=amazing_lederberg, release=1793, name=keepalived, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=Ceph keepalived)
Dec 01 09:51:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:57 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818002700 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:57 compute-2 systemd[1]: var-lib-containers-storage-overlay-9f1c14ff73c9f9a6565843032bf53eb000b76d032bf0d6ae02e15d56407e5663-merged.mount: Deactivated successfully.
Dec 01 09:51:57 compute-2 podman[84587]: 2025-12-01 09:51:57.671244293 +0000 UTC m=+6.053397857 container remove 24d850b4053a41a4b182edfe11c5334345345506a4c4b9f1a7ab59dd0bb8987b (image=quay.io/ceph/keepalived:2.2.4, name=amazing_lederberg, build-date=2023-02-22T09:23:20, architecture=x86_64, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, name=keepalived, description=keepalived for Ceph, io.openshift.expose-services=, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git)
Dec 01 09:51:57 compute-2 systemd[1]: libpod-conmon-24d850b4053a41a4b182edfe11c5334345345506a4c4b9f1a7ab59dd0bb8987b.scope: Deactivated successfully.
Dec 01 09:51:57 compute-2 systemd[1]: Reloading.
Dec 01 09:51:57 compute-2 systemd-rc-local-generator[84731]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:51:57 compute-2 systemd-sysv-generator[84735]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:51:57 compute-2 ceph-mon[76053]: pgmap v82: 353 pgs: 353 active+clean; 457 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s; 68 B/s, 5 objects/s recovering
Dec 01 09:51:57 compute-2 ceph-mon[76053]: 8.0 scrub starts
Dec 01 09:51:57 compute-2 ceph-mon[76053]: 8.0 scrub ok
Dec 01 09:51:57 compute-2 ceph-mon[76053]: 9.10 scrub starts
Dec 01 09:51:57 compute-2 ceph-mon[76053]: 9.10 scrub ok
Dec 01 09:51:57 compute-2 ceph-mon[76053]: 8.b scrub starts
Dec 01 09:51:57 compute-2 ceph-mon[76053]: 8.b scrub ok
Dec 01 09:51:57 compute-2 ceph-mon[76053]: 11.18 scrub starts
Dec 01 09:51:57 compute-2 ceph-mon[76053]: 11.18 scrub ok
Dec 01 09:51:57 compute-2 ceph-mon[76053]: 11.12 scrub starts
Dec 01 09:51:57 compute-2 ceph-mon[76053]: 11.12 scrub ok
Dec 01 09:51:57 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Dec 01 09:51:57 compute-2 ceph-mon[76053]: osdmap e75: 3 total, 3 up, 3 in
Dec 01 09:51:57 compute-2 ceph-mon[76053]: 8.5 deep-scrub starts
Dec 01 09:51:57 compute-2 ceph-mon[76053]: 8.5 deep-scrub ok
Dec 01 09:51:57 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Dec 01 09:51:58 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.7 scrub starts
Dec 01 09:51:58 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.7 scrub ok
Dec 01 09:51:58 compute-2 systemd[1]: Reloading.
Dec 01 09:51:58 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e76 e76: 3 total, 3 up, 3 in
Dec 01 09:51:58 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 76 pg[10.15( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=76) [0]/[2] r=0 lpr=76 pi=[70,76)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:58 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 76 pg[10.15( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=76) [0]/[2] r=0 lpr=76 pi=[70,76)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:58 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 76 pg[10.d( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=6 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=76) [0]/[2] r=0 lpr=76 pi=[70,76)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:58 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 76 pg[10.d( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=6 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=76) [0]/[2] r=0 lpr=76 pi=[70,76)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:58 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 76 pg[10.5( v 71'1025 (0'0,71'1025] local-lis/les=69/70 n=6 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=76) [0]/[2] r=0 lpr=76 pi=[69,76)/1 crt=70'1023 lcod 70'1024 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:58 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 76 pg[10.5( v 71'1025 (0'0,71'1025] local-lis/les=69/70 n=6 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=76) [0]/[2] r=0 lpr=76 pi=[69,76)/1 crt=70'1023 lcod 70'1024 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:58 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 76 pg[10.1d( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=76) [0]/[2] r=0 lpr=76 pi=[70,76)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:58 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 76 pg[10.1d( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=76) [0]/[2] r=0 lpr=76 pi=[70,76)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:58 compute-2 systemd-sysv-generator[84775]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:51:58 compute-2 systemd-rc-local-generator[84771]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:51:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:58 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:58 compute-2 systemd[1]: Starting Ceph keepalived.nfs.cephfs.compute-2.vkgipv for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec 01 09:51:58 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e77 e77: 3 total, 3 up, 3 in
Dec 01 09:51:58 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 77 pg[10.14( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=75/61 les/c/f=76/62/0 sis=77) [2] r=0 lpr=77 pi=[61,77)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:58 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 77 pg[10.14( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=75/61 les/c/f=76/62/0 sis=77) [2] r=0 lpr=77 pi=[61,77)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:58 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 77 pg[10.5( v 71'1025 (0'0,71'1025] local-lis/les=76/77 n=6 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=76) [0]/[2] async=[0] r=0 lpr=76 pi=[69,76)/1 crt=71'1025 lcod 70'1024 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:58 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 77 pg[10.1d( v 56'1015 (0'0,56'1015] local-lis/les=76/77 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=76) [0]/[2] async=[0] r=0 lpr=76 pi=[70,76)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:58 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 77 pg[10.15( v 56'1015 (0'0,56'1015] local-lis/les=76/77 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=76) [0]/[2] async=[0] r=0 lpr=76 pi=[70,76)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:58 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 77 pg[10.d( v 56'1015 (0'0,56'1015] local-lis/les=76/77 n=6 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=76) [0]/[2] async=[0] r=0 lpr=76 pi=[70,76)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:51:58 compute-2 podman[84826]: 2025-12-01 09:51:58.649338522 +0000 UTC m=+0.046357683 container create a0ef4b89ca18a14932dc44e4b2a521390b6fd555df2aaa06cada89bf83a23464 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv, io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, com.redhat.component=keepalived-container, distribution-scope=public, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2)
Dec 01 09:51:58 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/442f8458eb1d542daa2f1a0e1900ff3d951d84f094b61e6fdb644f4c54b22cd7/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:51:58 compute-2 podman[84826]: 2025-12-01 09:51:58.627359103 +0000 UTC m=+0.024378294 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Dec 01 09:51:58 compute-2 podman[84826]: 2025-12-01 09:51:58.75938749 +0000 UTC m=+0.156406651 container init a0ef4b89ca18a14932dc44e4b2a521390b6fd555df2aaa06cada89bf83a23464 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv, vendor=Red Hat, Inc., release=1793, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, com.redhat.component=keepalived-container, architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, distribution-scope=public, io.buildah.version=1.28.2)
Dec 01 09:51:58 compute-2 podman[84826]: 2025-12-01 09:51:58.765011404 +0000 UTC m=+0.162030565 container start a0ef4b89ca18a14932dc44e4b2a521390b6fd555df2aaa06cada89bf83a23464 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv, io.openshift.expose-services=, description=keepalived for Ceph, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=Ceph keepalived, vcs-type=git, name=keepalived, distribution-scope=public, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Dec 01 09:51:58 compute-2 bash[84826]: a0ef4b89ca18a14932dc44e4b2a521390b6fd555df2aaa06cada89bf83a23464
Dec 01 09:51:58 compute-2 systemd[1]: Started Ceph keepalived.nfs.cephfs.compute-2.vkgipv for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 09:51:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:51:58 2025: Starting Keepalived v2.2.4 (08/21,2021)
Dec 01 09:51:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:51:58 2025: Running on Linux 5.14.0-642.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025 (built for Linux 5.14.0)
Dec 01 09:51:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:51:58 2025: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Dec 01 09:51:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:51:58 2025: Configuration file /etc/keepalived/keepalived.conf
Dec 01 09:51:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:51:58 2025: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Dec 01 09:51:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:51:58 2025: Starting VRRP child process, pid=4
Dec 01 09:51:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:51:58 2025: Startup complete
Dec 01 09:51:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:51:58 2025: (VI_0) Entering BACKUP STATE (init)
Dec 01 09:51:58 compute-2 sudo[84523]: pam_unix(sudo:session): session closed for user root
Dec 01 09:51:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:51:58 2025: VRRP_Script(check_backend) succeeded
Dec 01 09:51:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:58 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824001930 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:59 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.11 scrub starts
Dec 01 09:51:59 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.11 scrub ok
Dec 01 09:51:59 compute-2 ceph-mon[76053]: 9.1a scrub starts
Dec 01 09:51:59 compute-2 ceph-mon[76053]: pgmap v84: 353 pgs: 353 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:51:59 compute-2 ceph-mon[76053]: 9.1a scrub ok
Dec 01 09:51:59 compute-2 ceph-mon[76053]: 11.1 scrub starts
Dec 01 09:51:59 compute-2 ceph-mon[76053]: 11.1 scrub ok
Dec 01 09:51:59 compute-2 ceph-mon[76053]: 9.7 scrub starts
Dec 01 09:51:59 compute-2 ceph-mon[76053]: 9.7 scrub ok
Dec 01 09:51:59 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Dec 01 09:51:59 compute-2 ceph-mon[76053]: osdmap e76: 3 total, 3 up, 3 in
Dec 01 09:51:59 compute-2 ceph-mon[76053]: osdmap e77: 3 total, 3 up, 3 in
Dec 01 09:51:59 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:51:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:59 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f00016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:51:59 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e78 e78: 3 total, 3 up, 3 in
Dec 01 09:51:59 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 78 pg[10.4( v 76'1027 (0'0,76'1027] local-lis/les=0/0 n=10 ec=61/50 lis/c=75/61 les/c/f=76/62/0 sis=78) [2] r=0 lpr=78 pi=[61,78)/1 luod=0'0 crt=72'1022 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:59 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 78 pg[10.4( v 76'1027 (0'0,76'1027] local-lis/les=0/0 n=10 ec=61/50 lis/c=75/61 les/c/f=76/62/0 sis=78) [2] r=0 lpr=78 pi=[61,78)/1 crt=72'1022 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:59 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 78 pg[10.1d( v 56'1015 (0'0,56'1015] local-lis/les=76/77 n=5 ec=61/50 lis/c=76/70 les/c/f=77/71/0 sis=78 pruub=14.833885193s) [0] async=[0] r=-1 lpr=78 pi=[70,78)/1 crt=56'1015 mlcod 56'1015 active pruub 136.961975098s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:59 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 78 pg[10.1c( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=7 ec=61/50 lis/c=75/61 les/c/f=76/62/0 sis=78) [2] r=0 lpr=78 pi=[61,78)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:59 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 78 pg[10.1c( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=7 ec=61/50 lis/c=75/61 les/c/f=76/62/0 sis=78) [2] r=0 lpr=78 pi=[61,78)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:59 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 78 pg[10.1d( v 56'1015 (0'0,56'1015] local-lis/les=76/77 n=5 ec=61/50 lis/c=76/70 les/c/f=77/71/0 sis=78 pruub=14.833804131s) [0] r=-1 lpr=78 pi=[70,78)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 136.961975098s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:51:59 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 78 pg[10.c( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=75/61 les/c/f=76/62/0 sis=78) [2] r=0 lpr=78 pi=[61,78)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:59 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 78 pg[10.c( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=75/61 les/c/f=76/62/0 sis=78) [2] r=0 lpr=78 pi=[61,78)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:51:59 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 78 pg[10.5( v 77'1030 (0'0,77'1030] local-lis/les=76/77 n=6 ec=61/50 lis/c=76/69 les/c/f=77/70/0 sis=78 pruub=14.798194885s) [0] async=[0] r=-1 lpr=78 pi=[69,78)/1 crt=71'1025 lcod 77'1029 mlcod 77'1029 active pruub 136.926895142s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:59 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 78 pg[10.5( v 77'1030 (0'0,77'1030] local-lis/les=76/77 n=6 ec=61/50 lis/c=76/69 les/c/f=77/70/0 sis=78 pruub=14.797915459s) [0] r=-1 lpr=78 pi=[69,78)/1 crt=71'1025 lcod 77'1029 mlcod 0'0 unknown NOTIFY pruub 136.926895142s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:51:59 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 78 pg[10.d( v 56'1015 (0'0,56'1015] local-lis/les=76/77 n=6 ec=61/50 lis/c=76/70 les/c/f=77/71/0 sis=78 pruub=14.832543373s) [0] async=[0] r=-1 lpr=78 pi=[70,78)/1 crt=56'1015 mlcod 56'1015 active pruub 136.962265015s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:59 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 78 pg[10.d( v 56'1015 (0'0,56'1015] local-lis/les=76/77 n=6 ec=61/50 lis/c=76/70 les/c/f=77/71/0 sis=78 pruub=14.832491875s) [0] r=-1 lpr=78 pi=[70,78)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 136.962265015s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:51:59 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 78 pg[10.15( v 56'1015 (0'0,56'1015] local-lis/les=76/77 n=5 ec=61/50 lis/c=76/70 les/c/f=77/71/0 sis=78 pruub=14.831372261s) [0] async=[0] r=-1 lpr=78 pi=[70,78)/1 crt=56'1015 mlcod 56'1015 active pruub 136.962051392s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:51:59 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 78 pg[10.15( v 56'1015 (0'0,56'1015] local-lis/les=76/77 n=5 ec=61/50 lis/c=76/70 les/c/f=77/71/0 sis=78 pruub=14.831318855s) [0] r=-1 lpr=78 pi=[70,78)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 136.962051392s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:51:59 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 78 pg[10.14( v 56'1015 (0'0,56'1015] local-lis/les=77/78 n=5 ec=61/50 lis/c=75/61 les/c/f=76/62/0 sis=77) [2] r=0 lpr=77 pi=[61,77)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:52:00 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.9 scrub starts
Dec 01 09:52:00 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.9 scrub ok
Dec 01 09:52:00 compute-2 ceph-mon[76053]: 9.d scrub starts
Dec 01 09:52:00 compute-2 ceph-mon[76053]: 9.d scrub ok
Dec 01 09:52:00 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:00 compute-2 ceph-mon[76053]: 12.11 scrub starts
Dec 01 09:52:00 compute-2 ceph-mon[76053]: 12.11 scrub ok
Dec 01 09:52:00 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:00 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:00 compute-2 ceph-mon[76053]: Deploying daemon alertmanager.compute-0 on compute-0
Dec 01 09:52:00 compute-2 ceph-mon[76053]: pgmap v87: 353 pgs: 353 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:52:00 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Dec 01 09:52:00 compute-2 ceph-mon[76053]: osdmap e78: 3 total, 3 up, 3 in
Dec 01 09:52:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:00 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818002700 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:00 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e79 e79: 3 total, 3 up, 3 in
Dec 01 09:52:00 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 79 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=79 pruub=8.379416466s) [1] r=-1 lpr=79 pi=[70,79)/1 crt=56'1015 mlcod 0'0 active pruub 131.729568481s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:00 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 79 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=79 pruub=8.379348755s) [1] r=-1 lpr=79 pi=[70,79)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 131.729568481s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:52:00 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 79 pg[10.7( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=79 pruub=8.378578186s) [1] r=-1 lpr=79 pi=[70,79)/1 crt=56'1015 mlcod 0'0 active pruub 131.729537964s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:00 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 79 pg[10.7( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=79 pruub=8.378518105s) [1] r=-1 lpr=79 pi=[70,79)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 131.729537964s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:52:00 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 79 pg[10.f( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=7 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=79 pruub=14.008636475s) [1] r=-1 lpr=79 pi=[69,79)/1 crt=56'1015 mlcod 0'0 active pruub 137.359725952s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:00 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 79 pg[10.17( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=5 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=79 pruub=14.013806343s) [1] r=-1 lpr=79 pi=[69,79)/1 crt=56'1015 mlcod 0'0 active pruub 137.364913940s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:00 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 79 pg[10.17( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=5 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=79 pruub=14.013780594s) [1] r=-1 lpr=79 pi=[69,79)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 137.364913940s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:52:00 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 79 pg[10.f( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=7 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=79 pruub=14.008543015s) [1] r=-1 lpr=79 pi=[69,79)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 137.359725952s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:52:00 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 79 pg[10.1c( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=7 ec=61/50 lis/c=75/61 les/c/f=76/62/0 sis=78) [2] r=0 lpr=78 pi=[61,78)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:52:00 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 79 pg[10.4( v 76'1027 (0'0,76'1027] local-lis/les=78/79 n=10 ec=61/50 lis/c=75/61 les/c/f=76/62/0 sis=78) [2] r=0 lpr=78 pi=[61,78)/1 crt=76'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:52:00 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 79 pg[10.c( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=6 ec=61/50 lis/c=75/61 les/c/f=76/62/0 sis=78) [2] r=0 lpr=78 pi=[61,78)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:52:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:00 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:01 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.13 scrub starts
Dec 01 09:52:01 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.13 scrub ok
Dec 01 09:52:01 compute-2 ceph-mon[76053]: 12.9 scrub starts
Dec 01 09:52:01 compute-2 ceph-mon[76053]: 12.9 scrub ok
Dec 01 09:52:01 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Dec 01 09:52:01 compute-2 ceph-mon[76053]: osdmap e79: 3 total, 3 up, 3 in
Dec 01 09:52:01 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:01 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824008dc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:02 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.7 scrub starts
Dec 01 09:52:02 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.7 scrub ok
Dec 01 09:52:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:02 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f00032f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:02 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e79 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:52:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:02 2025: (VI_0) Entering MASTER STATE
Dec 01 09:52:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:02 2025: (VI_0) Master received advert from 192.168.122.100 with higher priority 100, ours 90
Dec 01 09:52:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:02 2025: (VI_0) Entering BACKUP STATE
Dec 01 09:52:02 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e80 e80: 3 total, 3 up, 3 in
Dec 01 09:52:02 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 80 pg[10.17( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=5 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=80) [1]/[2] r=0 lpr=80 pi=[69,80)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:02 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 80 pg[10.17( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=5 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=80) [1]/[2] r=0 lpr=80 pi=[69,80)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec 01 09:52:02 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 80 pg[10.f( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=7 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=80) [1]/[2] r=0 lpr=80 pi=[69,80)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:02 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 80 pg[10.7( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=80) [1]/[2] r=0 lpr=80 pi=[70,80)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:02 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 80 pg[10.7( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=80) [1]/[2] r=0 lpr=80 pi=[70,80)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec 01 09:52:02 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 80 pg[10.f( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=7 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=80) [1]/[2] r=0 lpr=80 pi=[69,80)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec 01 09:52:02 compute-2 ceph-mon[76053]: 11.13 scrub starts
Dec 01 09:52:02 compute-2 ceph-mon[76053]: 11.13 scrub ok
Dec 01 09:52:02 compute-2 ceph-mon[76053]: 10.a scrub starts
Dec 01 09:52:02 compute-2 ceph-mon[76053]: 10.a scrub ok
Dec 01 09:52:02 compute-2 ceph-mon[76053]: pgmap v90: 353 pgs: 4 active+remapped, 4 peering, 345 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 203 B/s, 11 objects/s recovering
Dec 01 09:52:02 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 80 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=80) [1]/[2] r=0 lpr=80 pi=[70,80)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:02 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 80 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=80) [1]/[2] r=0 lpr=80 pi=[70,80)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec 01 09:52:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:02 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:03 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.8 scrub starts
Dec 01 09:52:03 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.8 scrub ok
Dec 01 09:52:03 compute-2 ceph-mon[76053]: 10.0 scrub starts
Dec 01 09:52:03 compute-2 ceph-mon[76053]: 10.0 scrub ok
Dec 01 09:52:03 compute-2 ceph-mon[76053]: 12.7 scrub starts
Dec 01 09:52:03 compute-2 ceph-mon[76053]: 12.7 scrub ok
Dec 01 09:52:03 compute-2 ceph-mon[76053]: osdmap e80: 3 total, 3 up, 3 in
Dec 01 09:52:03 compute-2 ceph-mon[76053]: 10.2 deep-scrub starts
Dec 01 09:52:03 compute-2 ceph-mon[76053]: 10.2 deep-scrub ok
Dec 01 09:52:03 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:03 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e81 e81: 3 total, 3 up, 3 in
Dec 01 09:52:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:03 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:03 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 81 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=80/81 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=80) [1]/[2] async=[1] r=0 lpr=80 pi=[70,80)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:52:03 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 81 pg[10.7( v 56'1015 (0'0,56'1015] local-lis/les=80/81 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=80) [1]/[2] async=[1] r=0 lpr=80 pi=[70,80)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:52:03 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 81 pg[10.f( v 56'1015 (0'0,56'1015] local-lis/les=80/81 n=7 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=80) [1]/[2] async=[1] r=0 lpr=80 pi=[69,80)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:52:03 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 81 pg[10.17( v 56'1015 (0'0,56'1015] local-lis/les=80/81 n=5 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=80) [1]/[2] async=[1] r=0 lpr=80 pi=[69,80)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:52:04 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.16 scrub starts
Dec 01 09:52:04 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.16 scrub ok
Dec 01 09:52:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:04 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824008dc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:04 compute-2 ceph-mon[76053]: 10.8 scrub starts
Dec 01 09:52:04 compute-2 ceph-mon[76053]: 10.8 scrub ok
Dec 01 09:52:04 compute-2 ceph-mon[76053]: 11.8 scrub starts
Dec 01 09:52:04 compute-2 ceph-mon[76053]: 11.8 scrub ok
Dec 01 09:52:04 compute-2 ceph-mon[76053]: 10.15 scrub starts
Dec 01 09:52:04 compute-2 ceph-mon[76053]: 10.15 scrub ok
Dec 01 09:52:04 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:04 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:04 compute-2 ceph-mon[76053]: osdmap e81: 3 total, 3 up, 3 in
Dec 01 09:52:04 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:04 compute-2 ceph-mon[76053]: Regenerating cephadm self-signed grafana TLS certificates
Dec 01 09:52:04 compute-2 ceph-mon[76053]: pgmap v93: 353 pgs: 1 active+clean+scrubbing, 4 active+remapped, 4 peering, 344 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 320 B/s, 17 objects/s recovering
Dec 01 09:52:04 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:04 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:04 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch
Dec 01 09:52:04 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:04 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f00032f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:05 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.4 scrub starts
Dec 01 09:52:05 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.4 scrub ok
Dec 01 09:52:05 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e82 e82: 3 total, 3 up, 3 in
Dec 01 09:52:05 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 82 pg[10.17( v 56'1015 (0'0,56'1015] local-lis/les=80/81 n=5 ec=61/50 lis/c=80/69 les/c/f=81/70/0 sis=82 pruub=14.665230751s) [1] async=[1] r=-1 lpr=82 pi=[69,82)/1 crt=56'1015 mlcod 56'1015 active pruub 142.221939087s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:05 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 82 pg[10.7( v 56'1015 (0'0,56'1015] local-lis/les=80/81 n=5 ec=61/50 lis/c=80/70 les/c/f=81/71/0 sis=82 pruub=14.661449432s) [1] async=[1] r=-1 lpr=82 pi=[70,82)/1 crt=56'1015 mlcod 56'1015 active pruub 142.218185425s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:05 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 82 pg[10.f( v 56'1015 (0'0,56'1015] local-lis/les=80/81 n=7 ec=61/50 lis/c=80/69 les/c/f=81/70/0 sis=82 pruub=14.665162086s) [1] async=[1] r=-1 lpr=82 pi=[69,82)/1 crt=56'1015 mlcod 56'1015 active pruub 142.221923828s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:05 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 82 pg[10.17( v 56'1015 (0'0,56'1015] local-lis/les=80/81 n=5 ec=61/50 lis/c=80/69 les/c/f=81/70/0 sis=82 pruub=14.665133476s) [1] r=-1 lpr=82 pi=[69,82)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 142.221939087s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:52:05 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 82 pg[10.7( v 56'1015 (0'0,56'1015] local-lis/les=80/81 n=5 ec=61/50 lis/c=80/70 les/c/f=81/71/0 sis=82 pruub=14.661360741s) [1] r=-1 lpr=82 pi=[70,82)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 142.218185425s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:52:05 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 82 pg[10.f( v 56'1015 (0'0,56'1015] local-lis/les=80/81 n=7 ec=61/50 lis/c=80/69 les/c/f=81/70/0 sis=82 pruub=14.665078163s) [1] r=-1 lpr=82 pi=[69,82)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 142.221923828s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:52:05 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 82 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=80/81 n=5 ec=61/50 lis/c=80/70 les/c/f=81/71/0 sis=82 pruub=14.661050797s) [1] async=[1] r=-1 lpr=82 pi=[70,82)/1 crt=56'1015 mlcod 56'1015 active pruub 142.218154907s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:05 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 82 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=80/81 n=5 ec=61/50 lis/c=80/70 les/c/f=81/71/0 sis=82 pruub=14.660984993s) [1] r=-1 lpr=82 pi=[70,82)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 142.218154907s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:52:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:05 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:05 compute-2 ceph-mon[76053]: 10.10 scrub starts
Dec 01 09:52:05 compute-2 ceph-mon[76053]: 10.10 scrub ok
Dec 01 09:52:05 compute-2 ceph-mon[76053]: 11.16 scrub starts
Dec 01 09:52:05 compute-2 ceph-mon[76053]: 11.16 scrub ok
Dec 01 09:52:05 compute-2 ceph-mon[76053]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch
Dec 01 09:52:05 compute-2 ceph-mon[76053]: Deploying daemon grafana.compute-0 on compute-0
Dec 01 09:52:05 compute-2 ceph-mon[76053]: 9.1b scrub starts
Dec 01 09:52:05 compute-2 ceph-mon[76053]: 9.1b scrub ok
Dec 01 09:52:05 compute-2 ceph-mon[76053]: 10.e scrub starts
Dec 01 09:52:05 compute-2 ceph-mon[76053]: 10.e scrub ok
Dec 01 09:52:05 compute-2 ceph-mon[76053]: 12.4 scrub starts
Dec 01 09:52:05 compute-2 ceph-mon[76053]: 12.4 scrub ok
Dec 01 09:52:05 compute-2 ceph-mon[76053]: osdmap e82: 3 total, 3 up, 3 in
Dec 01 09:52:05 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.19 scrub starts
Dec 01 09:52:05 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.19 scrub ok
Dec 01 09:52:06 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e83 e83: 3 total, 3 up, 3 in
Dec 01 09:52:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:06 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.1d deep-scrub starts
Dec 01 09:52:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824008dc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:06 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.1d deep-scrub ok
Dec 01 09:52:07 compute-2 ceph-mon[76053]: 8.1a scrub starts
Dec 01 09:52:07 compute-2 ceph-mon[76053]: 8.1a scrub ok
Dec 01 09:52:07 compute-2 ceph-mon[76053]: pgmap v95: 353 pgs: 1 active+clean+scrubbing, 4 active+remapped, 4 peering, 344 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 99 B/s, 5 objects/s recovering
Dec 01 09:52:07 compute-2 ceph-mon[76053]: 11.f scrub starts
Dec 01 09:52:07 compute-2 ceph-mon[76053]: 11.f scrub ok
Dec 01 09:52:07 compute-2 ceph-mon[76053]: 11.19 scrub starts
Dec 01 09:52:07 compute-2 ceph-mon[76053]: 11.19 scrub ok
Dec 01 09:52:07 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:07 compute-2 ceph-mon[76053]: osdmap e83: 3 total, 3 up, 3 in
Dec 01 09:52:07 compute-2 ceph-mon[76053]: 9.19 scrub starts
Dec 01 09:52:07 compute-2 ceph-mon[76053]: 9.19 scrub ok
Dec 01 09:52:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:52:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:07 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:07 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.17 scrub starts
Dec 01 09:52:07 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.17 scrub ok
Dec 01 09:52:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:08 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:08 compute-2 ceph-mon[76053]: 11.4 scrub starts
Dec 01 09:52:08 compute-2 ceph-mon[76053]: 11.4 scrub ok
Dec 01 09:52:08 compute-2 ceph-mon[76053]: 9.1d deep-scrub starts
Dec 01 09:52:08 compute-2 ceph-mon[76053]: 9.1d deep-scrub ok
Dec 01 09:52:08 compute-2 ceph-mon[76053]: 9.1e scrub starts
Dec 01 09:52:08 compute-2 ceph-mon[76053]: 9.1e scrub ok
Dec 01 09:52:08 compute-2 ceph-mon[76053]: pgmap v97: 353 pgs: 4 peering, 349 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:52:08 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.9 scrub starts
Dec 01 09:52:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:08 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:08 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.9 scrub ok
Dec 01 09:52:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:09 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ec0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:09 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.8 deep-scrub starts
Dec 01 09:52:09 compute-2 ceph-mon[76053]: 11.7 scrub starts
Dec 01 09:52:09 compute-2 ceph-mon[76053]: 11.7 scrub ok
Dec 01 09:52:09 compute-2 ceph-mon[76053]: 12.17 scrub starts
Dec 01 09:52:09 compute-2 ceph-mon[76053]: 12.17 scrub ok
Dec 01 09:52:09 compute-2 ceph-mon[76053]: 9.1f scrub starts
Dec 01 09:52:09 compute-2 ceph-mon[76053]: 9.1f scrub ok
Dec 01 09:52:09 compute-2 ceph-mon[76053]: 11.1b scrub starts
Dec 01 09:52:09 compute-2 ceph-mon[76053]: 11.1b scrub ok
Dec 01 09:52:09 compute-2 ceph-mon[76053]: 8.9 scrub starts
Dec 01 09:52:09 compute-2 ceph-mon[76053]: 8.9 scrub ok
Dec 01 09:52:09 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.8 deep-scrub ok
Dec 01 09:52:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:10 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:10 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:10 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.a scrub starts
Dec 01 09:52:11 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.a scrub ok
Dec 01 09:52:11 compute-2 ceph-mon[76053]: 8.1e deep-scrub starts
Dec 01 09:52:11 compute-2 ceph-mon[76053]: 8.1e deep-scrub ok
Dec 01 09:52:11 compute-2 ceph-mon[76053]: pgmap v98: 353 pgs: 4 peering, 349 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:52:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:11 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:12 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.5 deep-scrub starts
Dec 01 09:52:12 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.5 deep-scrub ok
Dec 01 09:52:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ec0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:12 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:52:12 compute-2 ceph-mon[76053]: 11.5 scrub starts
Dec 01 09:52:12 compute-2 ceph-mon[76053]: 11.5 scrub ok
Dec 01 09:52:12 compute-2 ceph-mon[76053]: 9.8 deep-scrub starts
Dec 01 09:52:12 compute-2 ceph-mon[76053]: 9.8 deep-scrub ok
Dec 01 09:52:12 compute-2 ceph-mon[76053]: 8.1d scrub starts
Dec 01 09:52:12 compute-2 ceph-mon[76053]: 8.1d scrub ok
Dec 01 09:52:12 compute-2 ceph-mon[76053]: 9.a scrub starts
Dec 01 09:52:12 compute-2 ceph-mon[76053]: 9.a scrub ok
Dec 01 09:52:12 compute-2 ceph-mon[76053]: 11.a scrub starts
Dec 01 09:52:12 compute-2 ceph-mon[76053]: 11.a scrub ok
Dec 01 09:52:12 compute-2 ceph-mon[76053]: 9.1c scrub starts
Dec 01 09:52:12 compute-2 ceph-mon[76053]: 9.1c scrub ok
Dec 01 09:52:12 compute-2 ceph-mon[76053]: pgmap v99: 353 pgs: 353 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:52:12 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Dec 01 09:52:12 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e84 e84: 3 total, 3 up, 3 in
Dec 01 09:52:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:13 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.17 scrub starts
Dec 01 09:52:13 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.17 scrub ok
Dec 01 09:52:13 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e85 e85: 3 total, 3 up, 3 in
Dec 01 09:52:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:13 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:13 compute-2 ceph-mon[76053]: 9.e scrub starts
Dec 01 09:52:13 compute-2 ceph-mon[76053]: 9.e scrub ok
Dec 01 09:52:13 compute-2 ceph-mon[76053]: 9.5 deep-scrub starts
Dec 01 09:52:13 compute-2 ceph-mon[76053]: 9.5 deep-scrub ok
Dec 01 09:52:13 compute-2 ceph-mon[76053]: 11.1f scrub starts
Dec 01 09:52:13 compute-2 ceph-mon[76053]: 11.1f scrub ok
Dec 01 09:52:13 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Dec 01 09:52:13 compute-2 ceph-mon[76053]: osdmap e84: 3 total, 3 up, 3 in
Dec 01 09:52:13 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:13 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:13 compute-2 ceph-mon[76053]: osdmap e85: 3 total, 3 up, 3 in
Dec 01 09:52:13 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:13 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Dec 01 09:52:13 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.1c scrub starts
Dec 01 09:52:14 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.1c scrub ok
Dec 01 09:52:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:14 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:14 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e86 e86: 3 total, 3 up, 3 in
Dec 01 09:52:14 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 86 pg[10.19( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=7 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=86 pruub=8.210325241s) [1] r=-1 lpr=86 pi=[69,86)/1 crt=56'1015 mlcod 0'0 active pruub 145.365219116s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:14 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 86 pg[10.19( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=7 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=86 pruub=8.210274696s) [1] r=-1 lpr=86 pi=[69,86)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 145.365219116s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:52:14 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 86 pg[10.9( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=6 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=86 pruub=8.209836960s) [1] r=-1 lpr=86 pi=[69,86)/1 crt=56'1015 mlcod 0'0 active pruub 145.365158081s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:14 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 86 pg[10.9( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=6 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=86 pruub=8.209787369s) [1] r=-1 lpr=86 pi=[69,86)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 145.365158081s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:52:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:14 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ec0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:15 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.18 scrub starts
Dec 01 09:52:15 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.18 scrub ok
Dec 01 09:52:15 compute-2 ceph-mon[76053]: 11.1c deep-scrub starts
Dec 01 09:52:15 compute-2 ceph-mon[76053]: 11.1c deep-scrub ok
Dec 01 09:52:15 compute-2 ceph-mon[76053]: 9.17 scrub starts
Dec 01 09:52:15 compute-2 ceph-mon[76053]: 9.17 scrub ok
Dec 01 09:52:15 compute-2 ceph-mon[76053]: 11.10 scrub starts
Dec 01 09:52:15 compute-2 ceph-mon[76053]: 11.10 scrub ok
Dec 01 09:52:15 compute-2 ceph-mon[76053]: pgmap v102: 353 pgs: 353 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:52:15 compute-2 ceph-mon[76053]: 8.12 deep-scrub starts
Dec 01 09:52:15 compute-2 ceph-mon[76053]: 8.12 deep-scrub ok
Dec 01 09:52:15 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:15 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:15 compute-2 ceph-mon[76053]: Deploying daemon haproxy.rgw.default.compute-0.owswdq on compute-0
Dec 01 09:52:15 compute-2 ceph-mon[76053]: 8.1c scrub starts
Dec 01 09:52:15 compute-2 ceph-mon[76053]: 8.1c scrub ok
Dec 01 09:52:15 compute-2 ceph-mon[76053]: 8.13 scrub starts
Dec 01 09:52:15 compute-2 ceph-mon[76053]: 8.13 scrub ok
Dec 01 09:52:15 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Dec 01 09:52:15 compute-2 ceph-mon[76053]: osdmap e86: 3 total, 3 up, 3 in
Dec 01 09:52:15 compute-2 sshd-session[84857]: Accepted publickey for zuul from 192.168.122.30 port 57520 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 09:52:15 compute-2 systemd-logind[795]: New session 36 of user zuul.
Dec 01 09:52:15 compute-2 systemd[1]: Started Session 36 of User zuul.
Dec 01 09:52:15 compute-2 sshd-session[84857]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:52:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:15 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:16 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.13 scrub starts
Dec 01 09:52:16 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.13 scrub ok
Dec 01 09:52:16 compute-2 python3.9[85010]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:52:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:16 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:16 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:17 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.3 scrub starts
Dec 01 09:52:17 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.3 scrub ok
Dec 01 09:52:17 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e87 e87: 3 total, 3 up, 3 in
Dec 01 09:52:17 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:52:17 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 87 pg[10.19( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=7 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=87) [1]/[2] r=0 lpr=87 pi=[69,87)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:17 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 87 pg[10.19( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=7 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=87) [1]/[2] r=0 lpr=87 pi=[69,87)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec 01 09:52:17 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 87 pg[10.9( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=6 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=87) [1]/[2] r=0 lpr=87 pi=[69,87)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:17 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 87 pg[10.9( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=6 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=87) [1]/[2] r=0 lpr=87 pi=[69,87)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec 01 09:52:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:17 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ec0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:18 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.16 scrub starts
Dec 01 09:52:18 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.16 scrub ok
Dec 01 09:52:18 compute-2 sudo[85223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvkhwffemntghjuxsmdyoqxoywugnuxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582737.7591734-59-127350585221947/AnsiballZ_command.py'
Dec 01 09:52:18 compute-2 sudo[85223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:52:18 compute-2 ceph-mon[76053]: 9.12 scrub starts
Dec 01 09:52:18 compute-2 ceph-mon[76053]: 9.12 scrub ok
Dec 01 09:52:18 compute-2 ceph-mon[76053]: 12.18 scrub starts
Dec 01 09:52:18 compute-2 ceph-mon[76053]: 12.18 scrub ok
Dec 01 09:52:18 compute-2 ceph-mon[76053]: 11.11 scrub starts
Dec 01 09:52:18 compute-2 ceph-mon[76053]: 11.11 scrub ok
Dec 01 09:52:18 compute-2 ceph-mon[76053]: pgmap v104: 353 pgs: 353 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:52:18 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Dec 01 09:52:18 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e88 e88: 3 total, 3 up, 3 in
Dec 01 09:52:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:18 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:18 compute-2 python3.9[85225]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:52:18 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 88 pg[10.9( v 56'1015 (0'0,56'1015] local-lis/les=87/88 n=6 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[69,87)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:52:18 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 88 pg[10.19( v 56'1015 (0'0,56'1015] local-lis/les=87/88 n=7 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[69,87)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:52:18 compute-2 sudo[85232]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:52:18 compute-2 sudo[85232]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:18 compute-2 sudo[85232]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:18 compute-2 sudo[85258]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/haproxy:2.3 --timeout 895 _orch deploy --fsid 365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:52:18 compute-2 sudo[85258]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:18 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:19 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.6 scrub starts
Dec 01 09:52:19 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.6 scrub ok
Dec 01 09:52:19 compute-2 podman[85326]: 2025-12-01 09:52:19.040777406 +0000 UTC m=+0.021315394 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Dec 01 09:52:19 compute-2 podman[85326]: 2025-12-01 09:52:19.302672457 +0000 UTC m=+0.283210415 container create 82b5277970898386d202f381c2fe1308e10a8aa5494c9dcb3ac67fe5782f925f (image=quay.io/ceph/haproxy:2.3, name=great_banzai)
Dec 01 09:52:19 compute-2 systemd[1]: Started libpod-conmon-82b5277970898386d202f381c2fe1308e10a8aa5494c9dcb3ac67fe5782f925f.scope.
Dec 01 09:52:19 compute-2 systemd[1]: Started libcrun container.
Dec 01 09:52:19 compute-2 ceph-mon[76053]: 11.1d scrub starts
Dec 01 09:52:19 compute-2 ceph-mon[76053]: 11.1d scrub ok
Dec 01 09:52:19 compute-2 ceph-mon[76053]: 12.13 scrub starts
Dec 01 09:52:19 compute-2 ceph-mon[76053]: 12.13 scrub ok
Dec 01 09:52:19 compute-2 ceph-mon[76053]: 12.19 scrub starts
Dec 01 09:52:19 compute-2 ceph-mon[76053]: 12.19 scrub ok
Dec 01 09:52:19 compute-2 ceph-mon[76053]: 8.4 scrub starts
Dec 01 09:52:19 compute-2 ceph-mon[76053]: 9.3 scrub starts
Dec 01 09:52:19 compute-2 ceph-mon[76053]: 9.3 scrub ok
Dec 01 09:52:19 compute-2 ceph-mon[76053]: 12.a scrub starts
Dec 01 09:52:19 compute-2 ceph-mon[76053]: 12.a scrub ok
Dec 01 09:52:19 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Dec 01 09:52:19 compute-2 ceph-mon[76053]: pgmap v105: 353 pgs: 2 unknown, 2 active+remapped, 349 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:52:19 compute-2 ceph-mon[76053]: 9.16 scrub starts
Dec 01 09:52:19 compute-2 ceph-mon[76053]: 9.16 scrub ok
Dec 01 09:52:19 compute-2 ceph-mon[76053]: 8.4 scrub ok
Dec 01 09:52:19 compute-2 ceph-mon[76053]: 11.1e deep-scrub starts
Dec 01 09:52:19 compute-2 ceph-mon[76053]: osdmap e87: 3 total, 3 up, 3 in
Dec 01 09:52:19 compute-2 ceph-mon[76053]: 12.1c deep-scrub starts
Dec 01 09:52:19 compute-2 ceph-mon[76053]: 11.1e deep-scrub ok
Dec 01 09:52:19 compute-2 ceph-mon[76053]: 12.1c deep-scrub ok
Dec 01 09:52:19 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:19 compute-2 ceph-mon[76053]: osdmap e88: 3 total, 3 up, 3 in
Dec 01 09:52:19 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:19 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:19 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:19 compute-2 ceph-mon[76053]: Deploying daemon haproxy.rgw.default.compute-2.zubkfi on compute-2
Dec 01 09:52:19 compute-2 podman[85326]: 2025-12-01 09:52:19.438924483 +0000 UTC m=+0.419462461 container init 82b5277970898386d202f381c2fe1308e10a8aa5494c9dcb3ac67fe5782f925f (image=quay.io/ceph/haproxy:2.3, name=great_banzai)
Dec 01 09:52:19 compute-2 podman[85326]: 2025-12-01 09:52:19.446383913 +0000 UTC m=+0.426921871 container start 82b5277970898386d202f381c2fe1308e10a8aa5494c9dcb3ac67fe5782f925f (image=quay.io/ceph/haproxy:2.3, name=great_banzai)
Dec 01 09:52:19 compute-2 great_banzai[85342]: 0 0
Dec 01 09:52:19 compute-2 systemd[1]: libpod-82b5277970898386d202f381c2fe1308e10a8aa5494c9dcb3ac67fe5782f925f.scope: Deactivated successfully.
Dec 01 09:52:19 compute-2 podman[85326]: 2025-12-01 09:52:19.454507991 +0000 UTC m=+0.435045969 container attach 82b5277970898386d202f381c2fe1308e10a8aa5494c9dcb3ac67fe5782f925f (image=quay.io/ceph/haproxy:2.3, name=great_banzai)
Dec 01 09:52:19 compute-2 podman[85326]: 2025-12-01 09:52:19.454930121 +0000 UTC m=+0.435468069 container died 82b5277970898386d202f381c2fe1308e10a8aa5494c9dcb3ac67fe5782f925f (image=quay.io/ceph/haproxy:2.3, name=great_banzai)
Dec 01 09:52:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:19 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.003000074s ======
Dec 01 09:52:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:19.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000074s
Dec 01 09:52:19 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e89 e89: 3 total, 3 up, 3 in
Dec 01 09:52:19 compute-2 systemd[1]: var-lib-containers-storage-overlay-2cb6d426a6e2b43feaf9c44aa295370a5410ff99ddc4b0f4c074a3330597d4ef-merged.mount: Deactivated successfully.
Dec 01 09:52:19 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 89 pg[10.9( v 56'1015 (0'0,56'1015] local-lis/les=87/88 n=6 ec=61/50 lis/c=87/69 les/c/f=88/70/0 sis=89 pruub=14.533324242s) [1] async=[1] r=-1 lpr=89 pi=[69,89)/1 crt=56'1015 mlcod 56'1015 active pruub 156.842239380s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:19 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 89 pg[10.9( v 56'1015 (0'0,56'1015] local-lis/les=87/88 n=6 ec=61/50 lis/c=87/69 les/c/f=88/70/0 sis=89 pruub=14.533211708s) [1] r=-1 lpr=89 pi=[69,89)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 156.842239380s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:52:19 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 89 pg[10.19( v 56'1015 (0'0,56'1015] local-lis/les=87/88 n=7 ec=61/50 lis/c=87/69 les/c/f=88/70/0 sis=89 pruub=14.532452583s) [1] async=[1] r=-1 lpr=89 pi=[69,89)/1 crt=56'1015 mlcod 56'1015 active pruub 156.842803955s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:19 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 89 pg[10.19( v 56'1015 (0'0,56'1015] local-lis/les=87/88 n=7 ec=61/50 lis/c=87/69 les/c/f=88/70/0 sis=89 pruub=14.532235146s) [1] r=-1 lpr=89 pi=[69,89)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 156.842803955s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:52:19 compute-2 podman[85326]: 2025-12-01 09:52:19.890691017 +0000 UTC m=+0.871228975 container remove 82b5277970898386d202f381c2fe1308e10a8aa5494c9dcb3ac67fe5782f925f (image=quay.io/ceph/haproxy:2.3, name=great_banzai)
Dec 01 09:52:19 compute-2 systemd[1]: libpod-conmon-82b5277970898386d202f381c2fe1308e10a8aa5494c9dcb3ac67fe5782f925f.scope: Deactivated successfully.
Dec 01 09:52:20 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.e deep-scrub starts
Dec 01 09:52:20 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.e deep-scrub ok
Dec 01 09:52:20 compute-2 systemd[1]: Reloading.
Dec 01 09:52:20 compute-2 systemd-rc-local-generator[85387]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:52:20 compute-2 systemd-sysv-generator[85390]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:52:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:20 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:20 compute-2 ceph-mon[76053]: 8.6 scrub starts
Dec 01 09:52:20 compute-2 ceph-mon[76053]: 8.6 scrub ok
Dec 01 09:52:20 compute-2 ceph-mon[76053]: 11.1a deep-scrub starts
Dec 01 09:52:20 compute-2 ceph-mon[76053]: 11.1a deep-scrub ok
Dec 01 09:52:20 compute-2 ceph-mon[76053]: 12.e scrub starts
Dec 01 09:52:20 compute-2 ceph-mon[76053]: 12.e scrub ok
Dec 01 09:52:20 compute-2 ceph-mon[76053]: osdmap e89: 3 total, 3 up, 3 in
Dec 01 09:52:20 compute-2 ceph-mon[76053]: pgmap v109: 353 pgs: 2 unknown, 2 active+remapped, 349 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:52:20 compute-2 systemd[1]: Reloading.
Dec 01 09:52:20 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e90 e90: 3 total, 3 up, 3 in
Dec 01 09:52:20 compute-2 systemd-rc-local-generator[85433]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:52:20 compute-2 systemd-sysv-generator[85438]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:52:20 compute-2 systemd[1]: Starting Ceph haproxy.rgw.default.compute-2.zubkfi for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec 01 09:52:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:20 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:21 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.b scrub starts
Dec 01 09:52:21 compute-2 podman[85492]: 2025-12-01 09:52:21.039883722 +0000 UTC m=+0.045749639 container create 25892f449a12f24c65fecd47107c74bf76658aed39f6f7823b5325fe3e6ba45b (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-rgw-default-compute-2-zubkfi)
Dec 01 09:52:21 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.b scrub ok
Dec 01 09:52:21 compute-2 systemd[80431]: Starting Mark boot as successful...
Dec 01 09:52:21 compute-2 systemd[80431]: Finished Mark boot as successful.
Dec 01 09:52:21 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b7c76bd36af17a087c0e7faeb06bf805d3a70a33f6b2fcb50ab5d80a5252243/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Dec 01 09:52:21 compute-2 podman[85492]: 2025-12-01 09:52:21.018776373 +0000 UTC m=+0.024642320 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Dec 01 09:52:21 compute-2 podman[85492]: 2025-12-01 09:52:21.128253856 +0000 UTC m=+0.134119813 container init 25892f449a12f24c65fecd47107c74bf76658aed39f6f7823b5325fe3e6ba45b (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-rgw-default-compute-2-zubkfi)
Dec 01 09:52:21 compute-2 podman[85492]: 2025-12-01 09:52:21.134384843 +0000 UTC m=+0.140250770 container start 25892f449a12f24c65fecd47107c74bf76658aed39f6f7823b5325fe3e6ba45b (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-rgw-default-compute-2-zubkfi)
Dec 01 09:52:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-rgw-default-compute-2-zubkfi[85508]: [NOTICE] 334/095221 (2) : New worker #1 (4) forked
Dec 01 09:52:21 compute-2 bash[85492]: 25892f449a12f24c65fecd47107c74bf76658aed39f6f7823b5325fe3e6ba45b
Dec 01 09:52:21 compute-2 systemd[1]: Started Ceph haproxy.rgw.default.compute-2.zubkfi for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 09:52:21 compute-2 sudo[85258]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:21 compute-2 ceph-mon[76053]: 11.e deep-scrub starts
Dec 01 09:52:21 compute-2 ceph-mon[76053]: 11.e deep-scrub ok
Dec 01 09:52:21 compute-2 ceph-mon[76053]: 9.11 scrub starts
Dec 01 09:52:21 compute-2 ceph-mon[76053]: 9.11 scrub ok
Dec 01 09:52:21 compute-2 ceph-mon[76053]: 12.8 scrub starts
Dec 01 09:52:21 compute-2 ceph-mon[76053]: 12.8 scrub ok
Dec 01 09:52:21 compute-2 ceph-mon[76053]: osdmap e90: 3 total, 3 up, 3 in
Dec 01 09:52:21 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:21 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:21 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:21 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:21 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:21 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e91 e91: 3 total, 3 up, 3 in
Dec 01 09:52:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:52:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:21.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:52:22 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.a scrub starts
Dec 01 09:52:22 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.a scrub ok
Dec 01 09:52:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:22 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ec0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:22 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:52:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:52:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:22.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:52:22 compute-2 ceph-mon[76053]: 9.b scrub starts
Dec 01 09:52:22 compute-2 ceph-mon[76053]: 9.b scrub ok
Dec 01 09:52:22 compute-2 ceph-mon[76053]: 8.19 scrub starts
Dec 01 09:52:22 compute-2 ceph-mon[76053]: 8.19 scrub ok
Dec 01 09:52:22 compute-2 ceph-mon[76053]: 12.b scrub starts
Dec 01 09:52:22 compute-2 ceph-mon[76053]: 12.b scrub ok
Dec 01 09:52:22 compute-2 ceph-mon[76053]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Dec 01 09:52:22 compute-2 ceph-mon[76053]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Dec 01 09:52:22 compute-2 ceph-mon[76053]: Deploying daemon keepalived.rgw.default.compute-0.jnboao on compute-0
Dec 01 09:52:22 compute-2 ceph-mon[76053]: osdmap e91: 3 total, 3 up, 3 in
Dec 01 09:52:22 compute-2 ceph-mon[76053]: pgmap v112: 353 pgs: 2 peering, 351 active+clean; 456 KiB data, 126 MiB used, 60 GiB / 60 GiB avail; 11 KiB/s rd, 0 B/s wr, 19 op/s; 254 B/s, 10 objects/s recovering
Dec 01 09:52:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:22 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:23 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.3 scrub starts
Dec 01 09:52:23 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.3 scrub ok
Dec 01 09:52:23 compute-2 sudo[85530]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:52:23 compute-2 sudo[85530]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:23 compute-2 sudo[85530]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:23 compute-2 sudo[85559]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/keepalived:2.2.4 --timeout 895 _orch deploy --fsid 365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:52:23 compute-2 sudo[85559]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:23 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814001710 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 09:52:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:23.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 09:52:23 compute-2 podman[85625]: 2025-12-01 09:52:23.914000187 +0000 UTC m=+0.038616116 container create 0ac7e6f80389c6e644dde8e0fced9acd7ca18b30f943fa951111661ff5f33523 (image=quay.io/ceph/keepalived:2.2.4, name=crazy_hamilton, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, distribution-scope=public, description=keepalived for Ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, release=1793, com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 01 09:52:23 compute-2 systemd[1]: Started libpod-conmon-0ac7e6f80389c6e644dde8e0fced9acd7ca18b30f943fa951111661ff5f33523.scope.
Dec 01 09:52:23 compute-2 systemd[1]: Started libcrun container.
Dec 01 09:52:23 compute-2 podman[85625]: 2025-12-01 09:52:23.895338321 +0000 UTC m=+0.019954250 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Dec 01 09:52:23 compute-2 podman[85625]: 2025-12-01 09:52:23.99607186 +0000 UTC m=+0.120687809 container init 0ac7e6f80389c6e644dde8e0fced9acd7ca18b30f943fa951111661ff5f33523 (image=quay.io/ceph/keepalived:2.2.4, name=crazy_hamilton, vendor=Red Hat, Inc., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, build-date=2023-02-22T09:23:20, release=1793, com.redhat.component=keepalived-container, description=keepalived for Ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64)
Dec 01 09:52:24 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.1b scrub starts
Dec 01 09:52:24 compute-2 podman[85625]: 2025-12-01 09:52:24.003922541 +0000 UTC m=+0.128538470 container start 0ac7e6f80389c6e644dde8e0fced9acd7ca18b30f943fa951111661ff5f33523 (image=quay.io/ceph/keepalived:2.2.4, name=crazy_hamilton, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, architecture=x86_64, name=keepalived, version=2.2.4)
Dec 01 09:52:24 compute-2 crazy_hamilton[85642]: 0 0
Dec 01 09:52:24 compute-2 podman[85625]: 2025-12-01 09:52:24.010306114 +0000 UTC m=+0.134922063 container attach 0ac7e6f80389c6e644dde8e0fced9acd7ca18b30f943fa951111661ff5f33523 (image=quay.io/ceph/keepalived:2.2.4, name=crazy_hamilton, architecture=x86_64, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, version=2.2.4, vendor=Red Hat, Inc., release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2)
Dec 01 09:52:24 compute-2 systemd[1]: libpod-0ac7e6f80389c6e644dde8e0fced9acd7ca18b30f943fa951111661ff5f33523.scope: Deactivated successfully.
Dec 01 09:52:24 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.1b scrub ok
Dec 01 09:52:24 compute-2 ceph-mon[76053]: 8.a scrub starts
Dec 01 09:52:24 compute-2 ceph-mon[76053]: 11.14 scrub starts
Dec 01 09:52:24 compute-2 ceph-mon[76053]: 8.a scrub ok
Dec 01 09:52:24 compute-2 ceph-mon[76053]: 11.14 scrub ok
Dec 01 09:52:24 compute-2 ceph-mon[76053]: 12.6 scrub starts
Dec 01 09:52:24 compute-2 ceph-mon[76053]: 12.6 scrub ok
Dec 01 09:52:24 compute-2 ceph-mon[76053]: 11.3 scrub starts
Dec 01 09:52:24 compute-2 ceph-mon[76053]: 11.3 scrub ok
Dec 01 09:52:24 compute-2 ceph-mon[76053]: 9.f scrub starts
Dec 01 09:52:24 compute-2 ceph-mon[76053]: 9.f scrub ok
Dec 01 09:52:24 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:24 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:24 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:24 compute-2 podman[85647]: 2025-12-01 09:52:24.086910518 +0000 UTC m=+0.043223024 container died 0ac7e6f80389c6e644dde8e0fced9acd7ca18b30f943fa951111661ff5f33523 (image=quay.io/ceph/keepalived:2.2.4, name=crazy_hamilton, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.28.2, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, name=keepalived, description=keepalived for Ceph, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph.)
Dec 01 09:52:24 compute-2 systemd[1]: var-lib-containers-storage-overlay-3c7c0a0387c8d8e9f36c446db1faf164dbd70e490a340b76148754014311b821-merged.mount: Deactivated successfully.
Dec 01 09:52:24 compute-2 podman[85647]: 2025-12-01 09:52:24.137216081 +0000 UTC m=+0.093528567 container remove 0ac7e6f80389c6e644dde8e0fced9acd7ca18b30f943fa951111661ff5f33523 (image=quay.io/ceph/keepalived:2.2.4, name=crazy_hamilton, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, vcs-type=git, io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, com.redhat.component=keepalived-container, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph, distribution-scope=public, io.openshift.expose-services=)
Dec 01 09:52:24 compute-2 systemd[1]: libpod-conmon-0ac7e6f80389c6e644dde8e0fced9acd7ca18b30f943fa951111661ff5f33523.scope: Deactivated successfully.
Dec 01 09:52:24 compute-2 systemd[1]: Reloading.
Dec 01 09:52:24 compute-2 systemd-rc-local-generator[85693]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:52:24 compute-2 systemd-sysv-generator[85696]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:52:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:24 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:24 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Dec 01 09:52:24 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:24.372214) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 09:52:24 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Dec 01 09:52:24 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582744372331, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 7627, "num_deletes": 257, "total_data_size": 20961070, "memory_usage": 21806464, "flush_reason": "Manual Compaction"}
Dec 01 09:52:24 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Dec 01 09:52:24 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582744450092, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 12787628, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 231, "largest_seqno": 7632, "table_properties": {"data_size": 12758195, "index_size": 18566, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9797, "raw_key_size": 95840, "raw_average_key_size": 24, "raw_value_size": 12683680, "raw_average_value_size": 3247, "num_data_blocks": 818, "num_entries": 3906, "num_filter_entries": 3906, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 1764582557, "file_creation_time": 1764582744, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:52:24 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 77947 microseconds, and 29989 cpu microseconds.
Dec 01 09:52:24 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:24.450164) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 12787628 bytes OK
Dec 01 09:52:24 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:24.450187) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Dec 01 09:52:24 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:24.452009) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Dec 01 09:52:24 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:24.452697) EVENT_LOG_v1 {"time_micros": 1764582744452653, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Dec 01 09:52:24 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:24.453543) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Dec 01 09:52:24 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 20919400, prev total WAL file size 20919400, number of live WAL files 2.
Dec 01 09:52:24 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:52:24 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:24.457286) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Dec 01 09:52:24 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Dec 01 09:52:24 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(12MB) 8(1648B)]
Dec 01 09:52:24 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582744457418, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 12789276, "oldest_snapshot_seqno": -1}
Dec 01 09:52:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:52:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:24.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:52:24 compute-2 systemd[1]: Reloading.
Dec 01 09:52:24 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 3652 keys, 12783836 bytes, temperature: kUnknown
Dec 01 09:52:24 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582744545285, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 12783836, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12755082, "index_size": 18532, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9157, "raw_key_size": 91600, "raw_average_key_size": 25, "raw_value_size": 12683738, "raw_average_value_size": 3473, "num_data_blocks": 818, "num_entries": 3652, "num_filter_entries": 3652, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764582744, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:52:24 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:52:24 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:24.545691) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 12783836 bytes
Dec 01 09:52:24 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:24.547710) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 145.3 rd, 145.3 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(12.2, 0.0 +0.0 blob) out(12.2 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 3911, records dropped: 259 output_compression: NoCompression
Dec 01 09:52:24 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:24.547749) EVENT_LOG_v1 {"time_micros": 1764582744547733, "job": 4, "event": "compaction_finished", "compaction_time_micros": 88005, "compaction_time_cpu_micros": 32168, "output_level": 6, "num_output_files": 1, "total_output_size": 12783836, "num_input_records": 3911, "num_output_records": 3652, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 09:52:24 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:52:24 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582744549963, "job": 4, "event": "table_file_deletion", "file_number": 14}
Dec 01 09:52:24 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:52:24 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582744550039, "job": 4, "event": "table_file_deletion", "file_number": 8}
Dec 01 09:52:24 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:24.457160) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:52:24 compute-2 systemd-rc-local-generator[85736]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:52:24 compute-2 systemd-sysv-generator[85740]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:52:24 compute-2 systemd[1]: Starting Ceph keepalived.rgw.default.compute-2.pcdbyn for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec 01 09:52:24 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.1 scrub starts
Dec 01 09:52:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:24 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ec0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:25 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.1 scrub ok
Dec 01 09:52:25 compute-2 ceph-mon[76053]: 12.10 scrub starts
Dec 01 09:52:25 compute-2 ceph-mon[76053]: 12.10 scrub ok
Dec 01 09:52:25 compute-2 ceph-mon[76053]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Dec 01 09:52:25 compute-2 ceph-mon[76053]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Dec 01 09:52:25 compute-2 ceph-mon[76053]: Deploying daemon keepalived.rgw.default.compute-2.pcdbyn on compute-2
Dec 01 09:52:25 compute-2 ceph-mon[76053]: pgmap v113: 353 pgs: 2 peering, 351 active+clean; 457 KiB data, 126 MiB used, 60 GiB / 60 GiB avail; 5.3 KiB/s rd, 0 B/s wr, 10 op/s; 0 B/s, 6 objects/s recovering
Dec 01 09:52:25 compute-2 ceph-mon[76053]: 10.1b scrub starts
Dec 01 09:52:25 compute-2 ceph-mon[76053]: 10.1b scrub ok
Dec 01 09:52:25 compute-2 ceph-mon[76053]: 8.1b scrub starts
Dec 01 09:52:25 compute-2 ceph-mon[76053]: 8.1b scrub ok
Dec 01 09:52:25 compute-2 ceph-mon[76053]: 12.12 scrub starts
Dec 01 09:52:25 compute-2 ceph-mon[76053]: 12.12 scrub ok
Dec 01 09:52:25 compute-2 podman[85793]: 2025-12-01 09:52:25.140828432 +0000 UTC m=+0.051424493 container create e6be91f81df0b36ff934b68ef40c06499cb5249d5e83b0021eba720f034162f4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn, io.openshift.expose-services=, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, distribution-scope=public, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20)
Dec 01 09:52:25 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f2a502bde378f10aee377587fa839d7da9bfdad969a217533378879967d03292/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:52:25 compute-2 podman[85793]: 2025-12-01 09:52:25.204719973 +0000 UTC m=+0.115316034 container init e6be91f81df0b36ff934b68ef40c06499cb5249d5e83b0021eba720f034162f4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn, distribution-scope=public, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, description=keepalived for Ceph, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, version=2.2.4, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2023-02-22T09:23:20, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 01 09:52:25 compute-2 podman[85793]: 2025-12-01 09:52:25.209824542 +0000 UTC m=+0.120420593 container start e6be91f81df0b36ff934b68ef40c06499cb5249d5e83b0021eba720f034162f4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, vcs-type=git, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., release=1793, com.redhat.component=keepalived-container, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, name=keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=keepalived for Ceph)
Dec 01 09:52:25 compute-2 bash[85793]: e6be91f81df0b36ff934b68ef40c06499cb5249d5e83b0021eba720f034162f4
Dec 01 09:52:25 compute-2 podman[85793]: 2025-12-01 09:52:25.120111554 +0000 UTC m=+0.030707635 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Dec 01 09:52:25 compute-2 systemd[1]: Started Ceph keepalived.rgw.default.compute-2.pcdbyn for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 09:52:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:25 2025: Starting Keepalived v2.2.4 (08/21,2021)
Dec 01 09:52:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:25 2025: Running on Linux 5.14.0-642.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025 (built for Linux 5.14.0)
Dec 01 09:52:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:25 2025: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Dec 01 09:52:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:25 2025: Configuration file /etc/keepalived/keepalived.conf
Dec 01 09:52:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:25 2025: Failed to bind to process monitoring socket - errno 98 - Address already in use
Dec 01 09:52:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:25 2025: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Dec 01 09:52:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:25 2025: Starting VRRP child process, pid=4
Dec 01 09:52:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:25 2025: Startup complete
Dec 01 09:52:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:25 2025: (VI_0) Entering BACKUP STATE (init)
Dec 01 09:52:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:25 2025: VRRP_Script(check_backend) succeeded
Dec 01 09:52:25 compute-2 sudo[85559]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:25 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 09:52:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:25.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 09:52:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:25 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:25 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.c scrub starts
Dec 01 09:52:25 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.c scrub ok
Dec 01 09:52:26 compute-2 ceph-mon[76053]: 10.1 scrub starts
Dec 01 09:52:26 compute-2 ceph-mon[76053]: 10.1 scrub ok
Dec 01 09:52:26 compute-2 ceph-mon[76053]: 8.18 scrub starts
Dec 01 09:52:26 compute-2 ceph-mon[76053]: 12.c scrub starts
Dec 01 09:52:26 compute-2 ceph-mon[76053]: 8.18 scrub ok
Dec 01 09:52:26 compute-2 ceph-mon[76053]: 12.c scrub ok
Dec 01 09:52:26 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:26 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:26 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:26 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:26 compute-2 ceph-mon[76053]: Deploying daemon prometheus.compute-0 on compute-0
Dec 01 09:52:26 compute-2 ceph-mon[76053]: pgmap v114: 353 pgs: 2 peering, 351 active+clean; 457 KiB data, 126 MiB used, 60 GiB / 60 GiB avail; 4.6 KiB/s rd, 0 B/s wr, 9 op/s; 0 B/s, 5 objects/s recovering
Dec 01 09:52:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:26 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814001710 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:52:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:26.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:52:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:26 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:26 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:26 compute-2 sudo[85223]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:26 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.11 scrub starts
Dec 01 09:52:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:26 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:27 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.11 scrub ok
Dec 01 09:52:27 compute-2 sshd-session[84860]: Connection closed by 192.168.122.30 port 57520
Dec 01 09:52:27 compute-2 sshd-session[84857]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:52:27 compute-2 systemd[1]: session-36.scope: Deactivated successfully.
Dec 01 09:52:27 compute-2 systemd[1]: session-36.scope: Consumed 9.659s CPU time.
Dec 01 09:52:27 compute-2 systemd-logind[795]: Session 36 logged out. Waiting for processes to exit.
Dec 01 09:52:27 compute-2 systemd-logind[795]: Removed session 36.
Dec 01 09:52:27 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:52:27 compute-2 ceph-mon[76053]: 10.c scrub starts
Dec 01 09:52:27 compute-2 ceph-mon[76053]: 10.c scrub ok
Dec 01 09:52:27 compute-2 ceph-mon[76053]: 10.17 deep-scrub starts
Dec 01 09:52:27 compute-2 ceph-mon[76053]: 10.d scrub starts
Dec 01 09:52:27 compute-2 ceph-mon[76053]: 10.17 deep-scrub ok
Dec 01 09:52:27 compute-2 ceph-mon[76053]: 10.d scrub ok
Dec 01 09:52:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:27 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ec0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 09:52:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:27.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 09:52:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:27 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:27 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:28 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.3 scrub starts
Dec 01 09:52:28 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.3 scrub ok
Dec 01 09:52:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:28 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:52:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:28.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:52:28 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e92 e92: 3 total, 3 up, 3 in
Dec 01 09:52:28 compute-2 ceph-mon[76053]: 10.11 scrub starts
Dec 01 09:52:28 compute-2 ceph-mon[76053]: 10.11 scrub ok
Dec 01 09:52:28 compute-2 ceph-mon[76053]: 10.5 deep-scrub starts
Dec 01 09:52:28 compute-2 ceph-mon[76053]: 10.5 deep-scrub ok
Dec 01 09:52:28 compute-2 ceph-mon[76053]: 10.6 deep-scrub starts
Dec 01 09:52:28 compute-2 ceph-mon[76053]: 10.6 deep-scrub ok
Dec 01 09:52:28 compute-2 ceph-mon[76053]: pgmap v115: 353 pgs: 353 active+clean; 456 KiB data, 126 MiB used, 60 GiB / 60 GiB avail; 0 B/s, 0 objects/s recovering
Dec 01 09:52:28 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Dec 01 09:52:28 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 92 pg[10.1b( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=2 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=92 pruub=10.204820633s) [1] r=-1 lpr=92 pi=[69,92)/1 crt=56'1015 mlcod 0'0 active pruub 161.365509033s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:28 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 92 pg[10.1b( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=2 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=92 pruub=10.204762459s) [1] r=-1 lpr=92 pi=[69,92)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 161.365509033s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:52:28 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 92 pg[10.b( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=6 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=92 pruub=12.570460320s) [1] r=-1 lpr=92 pi=[70,92)/1 crt=56'1015 mlcod 0'0 active pruub 163.732162476s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:28 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 92 pg[10.b( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=6 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=92 pruub=12.570431709s) [1] r=-1 lpr=92 pi=[70,92)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 163.732162476s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:52:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:28 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:28 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:29 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f98140018b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:29 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.14 scrub starts
Dec 01 09:52:29 compute-2 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.14 scrub ok
Dec 01 09:52:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:29 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:29 compute-2 ceph-mon[76053]: 10.3 scrub starts
Dec 01 09:52:29 compute-2 ceph-mon[76053]: 10.3 scrub ok
Dec 01 09:52:29 compute-2 ceph-mon[76053]: 10.9 scrub starts
Dec 01 09:52:29 compute-2 ceph-mon[76053]: 10.9 scrub ok
Dec 01 09:52:29 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:29 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Dec 01 09:52:29 compute-2 ceph-mon[76053]: osdmap e92: 3 total, 3 up, 3 in
Dec 01 09:52:29 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Dec 01 09:52:29 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e93 e93: 3 total, 3 up, 3 in
Dec 01 09:52:29 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 93 pg[10.1c( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=7 ec=61/50 lis/c=78/78 les/c/f=79/79/0 sis=93 pruub=11.166763306s) [1] r=-1 lpr=93 pi=[78,93)/1 crt=56'1015 mlcod 0'0 active pruub 163.355636597s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:29 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 93 pg[10.1c( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=7 ec=61/50 lis/c=78/78 les/c/f=79/79/0 sis=93 pruub=11.166704178s) [1] r=-1 lpr=93 pi=[78,93)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 163.355636597s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:52:29 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 93 pg[10.1b( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=2 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=93) [1]/[2] r=0 lpr=93 pi=[69,93)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:29 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 93 pg[10.1b( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=2 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=93) [1]/[2] r=0 lpr=93 pi=[69,93)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec 01 09:52:29 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 93 pg[10.c( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=5 ec=61/50 lis/c=78/78 les/c/f=79/79/0 sis=93 pruub=11.169716835s) [1] r=-1 lpr=93 pi=[78,93)/1 crt=56'1015 mlcod 0'0 active pruub 163.359848022s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:29 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 93 pg[10.c( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=5 ec=61/50 lis/c=78/78 les/c/f=79/79/0 sis=93 pruub=11.169694901s) [1] r=-1 lpr=93 pi=[78,93)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 163.359848022s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:52:29 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 93 pg[10.b( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=6 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=93) [1]/[2] r=0 lpr=93 pi=[70,93)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:29 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 93 pg[10.b( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=6 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=93) [1]/[2] r=0 lpr=93 pi=[70,93)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec 01 09:52:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:52:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:29.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:52:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:29 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:29 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:30 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ec0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:52:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:30.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:52:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:30 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:30 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:30 compute-2 ceph-mon[76053]: 10.14 scrub starts
Dec 01 09:52:30 compute-2 ceph-mon[76053]: 10.14 scrub ok
Dec 01 09:52:30 compute-2 ceph-mon[76053]: pgmap v117: 353 pgs: 353 active+clean; 456 KiB data, 126 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:52:30 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Dec 01 09:52:30 compute-2 ceph-mon[76053]: osdmap e93: 3 total, 3 up, 3 in
Dec 01 09:52:30 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e94 e94: 3 total, 3 up, 3 in
Dec 01 09:52:30 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 94 pg[10.c( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=5 ec=61/50 lis/c=78/78 les/c/f=79/79/0 sis=94) [1]/[2] r=0 lpr=94 pi=[78,94)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:30 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 94 pg[10.c( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=5 ec=61/50 lis/c=78/78 les/c/f=79/79/0 sis=94) [1]/[2] r=0 lpr=94 pi=[78,94)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec 01 09:52:30 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 94 pg[10.1c( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=7 ec=61/50 lis/c=78/78 les/c/f=79/79/0 sis=94) [1]/[2] r=0 lpr=94 pi=[78,94)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:30 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 94 pg[10.1c( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=7 ec=61/50 lis/c=78/78 les/c/f=79/79/0 sis=94) [1]/[2] r=0 lpr=94 pi=[78,94)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec 01 09:52:30 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 94 pg[10.1b( v 56'1015 (0'0,56'1015] local-lis/les=93/94 n=2 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=93) [1]/[2] async=[1] r=0 lpr=93 pi=[69,93)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:52:30 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 94 pg[10.b( v 56'1015 (0'0,56'1015] local-lis/les=93/94 n=6 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=93) [1]/[2] async=[1] r=0 lpr=93 pi=[70,93)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:52:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:31 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:31 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f98140018b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:52:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:31.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:52:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:31 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:31 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:31 compute-2 ceph-mon[76053]: osdmap e94: 3 total, 3 up, 3 in
Dec 01 09:52:31 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:31 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:31 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec 01 09:52:31 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mgr module enable", "module": "prometheus"}]: dispatch
Dec 01 09:52:31 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e95 e95: 3 total, 3 up, 3 in
Dec 01 09:52:31 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 95 pg[10.b( v 56'1015 (0'0,56'1015] local-lis/les=93/94 n=6 ec=61/50 lis/c=93/70 les/c/f=94/71/0 sis=95 pruub=14.999175072s) [1] async=[1] r=-1 lpr=95 pi=[70,95)/1 crt=56'1015 mlcod 56'1015 active pruub 169.399078369s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:31 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 95 pg[10.1b( v 56'1015 (0'0,56'1015] local-lis/les=93/94 n=2 ec=61/50 lis/c=93/69 les/c/f=94/70/0 sis=95 pruub=14.995428085s) [1] async=[1] r=-1 lpr=95 pi=[69,95)/1 crt=56'1015 mlcod 56'1015 active pruub 169.395416260s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:31 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 95 pg[10.b( v 56'1015 (0'0,56'1015] local-lis/les=93/94 n=6 ec=61/50 lis/c=93/70 les/c/f=94/71/0 sis=95 pruub=14.999093056s) [1] r=-1 lpr=95 pi=[70,95)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 169.399078369s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:52:31 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 95 pg[10.1b( v 56'1015 (0'0,56'1015] local-lis/les=93/94 n=2 ec=61/50 lis/c=93/69 les/c/f=94/70/0 sis=95 pruub=14.995348930s) [1] r=-1 lpr=95 pi=[69,95)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 169.395416260s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:52:31 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 95 pg[10.c( v 56'1015 (0'0,56'1015] local-lis/les=94/95 n=5 ec=61/50 lis/c=78/78 les/c/f=79/79/0 sis=94) [1]/[2] async=[1] r=0 lpr=94 pi=[78,94)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:52:31 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 95 pg[10.1c( v 56'1015 (0'0,56'1015] local-lis/les=94/95 n=7 ec=61/50 lis/c=78/78 les/c/f=79/79/0 sis=94) [1]/[2] async=[1] r=0 lpr=94 pi=[78,94)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:52:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:32 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ec0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:32 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:52:32 compute-2 ceph-mgr[76365]: mgr handle_mgr_map respawning because set of enabled modules changed!
Dec 01 09:52:32 compute-2 ceph-mgr[76365]: mgr respawn  e: '/usr/bin/ceph-mgr'
Dec 01 09:52:32 compute-2 ceph-mgr[76365]: mgr respawn  0: '/usr/bin/ceph-mgr'
Dec 01 09:52:32 compute-2 ceph-mgr[76365]: mgr respawn  1: '-n'
Dec 01 09:52:32 compute-2 ceph-mgr[76365]: mgr respawn  2: 'mgr.compute-2.kdtkls'
Dec 01 09:52:32 compute-2 ceph-mgr[76365]: mgr respawn  3: '-f'
Dec 01 09:52:32 compute-2 ceph-mgr[76365]: mgr respawn  4: '--setuser'
Dec 01 09:52:32 compute-2 ceph-mgr[76365]: mgr respawn  5: 'ceph'
Dec 01 09:52:32 compute-2 ceph-mgr[76365]: mgr respawn  6: '--setgroup'
Dec 01 09:52:32 compute-2 ceph-mgr[76365]: mgr respawn  7: 'ceph'
Dec 01 09:52:32 compute-2 ceph-mgr[76365]: mgr respawn  8: '--default-log-to-file=false'
Dec 01 09:52:32 compute-2 ceph-mgr[76365]: mgr respawn  9: '--default-log-to-journald=true'
Dec 01 09:52:32 compute-2 ceph-mgr[76365]: mgr respawn  10: '--default-log-to-stderr=false'
Dec 01 09:52:32 compute-2 ceph-mgr[76365]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Dec 01 09:52:32 compute-2 ceph-mgr[76365]: mgr respawn  exe_path /proc/self/exe
Dec 01 09:52:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:52:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:32.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:52:32 compute-2 sshd-session[80446]: Connection closed by 192.168.122.100 port 53718
Dec 01 09:52:32 compute-2 sshd-session[80427]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 01 09:52:32 compute-2 systemd[1]: session-34.scope: Deactivated successfully.
Dec 01 09:52:32 compute-2 systemd[1]: session-34.scope: Consumed 28.712s CPU time.
Dec 01 09:52:32 compute-2 systemd-logind[795]: Session 34 logged out. Waiting for processes to exit.
Dec 01 09:52:32 compute-2 systemd-logind[795]: Removed session 34.
Dec 01 09:52:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: ignoring --setuser ceph since I am not root
Dec 01 09:52:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: ignoring --setgroup ceph since I am not root
Dec 01 09:52:32 compute-2 ceph-mgr[76365]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Dec 01 09:52:32 compute-2 ceph-mgr[76365]: pidfile_write: ignore empty --pid-file
Dec 01 09:52:32 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'alerts'
Dec 01 09:52:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:32.738+0000 7f47059a1140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 01 09:52:32 compute-2 ceph-mgr[76365]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 01 09:52:32 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'balancer'
Dec 01 09:52:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:32.839+0000 7f47059a1140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 01 09:52:32 compute-2 ceph-mgr[76365]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 01 09:52:32 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'cephadm'
Dec 01 09:52:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:32 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:32 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:33 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 15 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:33 compute-2 ceph-mon[76053]: pgmap v120: 353 pgs: 2 unknown, 351 active+clean; 456 KiB data, 126 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:52:33 compute-2 ceph-mon[76053]: osdmap e95: 3 total, 3 up, 3 in
Dec 01 09:52:33 compute-2 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "mgr module enable", "module": "prometheus"}]': finished
Dec 01 09:52:33 compute-2 ceph-mon[76053]: mgrmap e28: compute-0.fospow(active, since 2m), standbys: compute-1.ymizfm, compute-2.kdtkls
Dec 01 09:52:33 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e96 e96: 3 total, 3 up, 3 in
Dec 01 09:52:33 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 96 pg[10.1c( v 56'1015 (0'0,56'1015] local-lis/les=94/95 n=7 ec=61/50 lis/c=94/78 les/c/f=95/79/0 sis=96 pruub=14.861362457s) [1] async=[1] r=-1 lpr=96 pi=[78,96)/1 crt=56'1015 mlcod 56'1015 active pruub 170.408294678s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:33 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 96 pg[10.1c( v 56'1015 (0'0,56'1015] local-lis/les=94/95 n=7 ec=61/50 lis/c=94/78 les/c/f=95/79/0 sis=96 pruub=14.861237526s) [1] r=-1 lpr=96 pi=[78,96)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 170.408294678s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:52:33 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 96 pg[10.c( v 56'1015 (0'0,56'1015] local-lis/les=94/95 n=5 ec=61/50 lis/c=94/78 les/c/f=95/79/0 sis=96 pruub=14.856736183s) [1] async=[1] r=-1 lpr=96 pi=[78,96)/1 crt=56'1015 mlcod 56'1015 active pruub 170.404541016s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:33 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 96 pg[10.c( v 56'1015 (0'0,56'1015] local-lis/les=94/95 n=5 ec=61/50 lis/c=94/78 les/c/f=95/79/0 sis=96 pruub=14.856654167s) [1] r=-1 lpr=96 pi=[78,96)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 170.404541016s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:52:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:33 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 15 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.698409) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582753698635, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 519, "num_deletes": 251, "total_data_size": 1054080, "memory_usage": 1066208, "flush_reason": "Manual Compaction"}
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582753705747, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 698083, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7637, "largest_seqno": 8151, "table_properties": {"data_size": 695126, "index_size": 865, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 6844, "raw_average_key_size": 17, "raw_value_size": 688892, "raw_average_value_size": 1770, "num_data_blocks": 37, "num_entries": 389, "num_filter_entries": 389, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582745, "oldest_key_time": 1764582745, "file_creation_time": 1764582753, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 7486 microseconds, and 3391 cpu microseconds.
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.705906) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 698083 bytes OK
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.705965) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.707376) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.707467) EVENT_LOG_v1 {"time_micros": 1764582753707451, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.707506) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 1050844, prev total WAL file size 1050844, number of live WAL files 2.
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.708942) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323532' seq:0, type:0; will stop at (end)
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(681KB)], [15(12MB)]
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582753709044, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 13481919, "oldest_snapshot_seqno": -1}
Dec 01 09:52:33 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'crash'
Dec 01 09:52:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:52:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:33.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 3516 keys, 13035118 bytes, temperature: kUnknown
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582753815330, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 13035118, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13007102, "index_size": 18114, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8837, "raw_key_size": 90839, "raw_average_key_size": 25, "raw_value_size": 12937912, "raw_average_value_size": 3679, "num_data_blocks": 782, "num_entries": 3516, "num_filter_entries": 3516, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764582753, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.815680) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 13035118 bytes
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.816923) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 126.8 rd, 122.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 12.2 +0.0 blob) out(12.4 +0.0 blob), read-write-amplify(38.0) write-amplify(18.7) OK, records in: 4041, records dropped: 525 output_compression: NoCompression
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.816945) EVENT_LOG_v1 {"time_micros": 1764582753816933, "job": 6, "event": "compaction_finished", "compaction_time_micros": 106359, "compaction_time_cpu_micros": 35503, "output_level": 6, "num_output_files": 1, "total_output_size": 13035118, "num_input_records": 4041, "num_output_records": 3516, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582753817161, "job": 6, "event": "table_file_deletion", "file_number": 17}
Dec 01 09:52:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:33.817+0000 7f47059a1140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 01 09:52:33 compute-2 ceph-mgr[76365]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 01 09:52:33 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'dashboard'
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582753819732, "job": 6, "event": "table_file_deletion", "file_number": 15}
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.708781) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.819820) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.819828) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.819830) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.819832) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:52:33 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.819834) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:52:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:33 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:33 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:34 compute-2 ceph-mon[76053]: osdmap e96: 3 total, 3 up, 3 in
Dec 01 09:52:34 compute-2 ceph-mon[76053]: 10.b scrub starts
Dec 01 09:52:34 compute-2 ceph-mon[76053]: 10.b scrub ok
Dec 01 09:52:34 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e97 e97: 3 total, 3 up, 3 in
Dec 01 09:52:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:34 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f98140018b0 fd 15 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:34 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'devicehealth'
Dec 01 09:52:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec 01 09:52:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:34.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec 01 09:52:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:34.572+0000 7f47059a1140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 01 09:52:34 compute-2 ceph-mgr[76365]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 01 09:52:34 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'diskprediction_local'
Dec 01 09:52:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec 01 09:52:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec 01 09:52:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]:   from numpy import show_config as show_numpy_config
Dec 01 09:52:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:34.777+0000 7f47059a1140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 01 09:52:34 compute-2 ceph-mgr[76365]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 01 09:52:34 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'influx'
Dec 01 09:52:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:34 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:34 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:34.857+0000 7f47059a1140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 01 09:52:34 compute-2 ceph-mgr[76365]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 01 09:52:34 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'insights'
Dec 01 09:52:34 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'iostat'
Dec 01 09:52:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:35 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ec0 fd 15 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:35.023+0000 7f47059a1140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 01 09:52:35 compute-2 ceph-mgr[76365]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 01 09:52:35 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'k8sevents'
Dec 01 09:52:35 compute-2 ceph-mon[76053]: osdmap e97: 3 total, 3 up, 3 in
Dec 01 09:52:35 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'localpool'
Dec 01 09:52:35 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'mds_autoscaler'
Dec 01 09:52:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:35 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 15 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:35 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'mirroring'
Dec 01 09:52:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:52:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:35.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:52:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:35 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:35 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:35 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'nfs'
Dec 01 09:52:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:36.174+0000 7f47059a1140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 01 09:52:36 compute-2 ceph-mgr[76365]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 01 09:52:36 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'orchestrator'
Dec 01 09:52:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:36 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 15 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:36.425+0000 7f47059a1140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 01 09:52:36 compute-2 ceph-mgr[76365]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 01 09:52:36 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'osd_perf_query'
Dec 01 09:52:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec 01 09:52:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:36.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec 01 09:52:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:36.514+0000 7f47059a1140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 01 09:52:36 compute-2 ceph-mgr[76365]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 01 09:52:36 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'osd_support'
Dec 01 09:52:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:36.605+0000 7f47059a1140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 01 09:52:36 compute-2 ceph-mgr[76365]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 01 09:52:36 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'pg_autoscaler'
Dec 01 09:52:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:36.705+0000 7f47059a1140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 01 09:52:36 compute-2 ceph-mgr[76365]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 01 09:52:36 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'progress'
Dec 01 09:52:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:36.785+0000 7f47059a1140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 01 09:52:36 compute-2 ceph-mgr[76365]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 01 09:52:36 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'prometheus'
Dec 01 09:52:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:36 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:36 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:37 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f98140018b0 fd 15 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:37.175+0000 7f47059a1140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 01 09:52:37 compute-2 ceph-mgr[76365]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 01 09:52:37 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'rbd_support'
Dec 01 09:52:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:37.290+0000 7f47059a1140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 01 09:52:37 compute-2 ceph-mgr[76365]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 01 09:52:37 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'restful'
Dec 01 09:52:37 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:52:37 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'rgw'
Dec 01 09:52:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:37 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ec0 fd 15 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:52:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:37.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:52:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:37.835+0000 7f47059a1140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 01 09:52:37 compute-2 ceph-mgr[76365]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 01 09:52:37 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'rook'
Dec 01 09:52:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:37 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:37 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:38 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 15 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec 01 09:52:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:38.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec 01 09:52:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:38.522+0000 7f47059a1140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 01 09:52:38 compute-2 ceph-mgr[76365]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 01 09:52:38 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'selftest'
Dec 01 09:52:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:38.618+0000 7f47059a1140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 01 09:52:38 compute-2 ceph-mgr[76365]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 01 09:52:38 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'snap_schedule'
Dec 01 09:52:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:38.711+0000 7f47059a1140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec 01 09:52:38 compute-2 ceph-mgr[76365]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec 01 09:52:38 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'stats'
Dec 01 09:52:38 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'status'
Dec 01 09:52:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:38 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:38 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:38.897+0000 7f47059a1140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec 01 09:52:38 compute-2 ceph-mgr[76365]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec 01 09:52:38 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'telegraf'
Dec 01 09:52:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:38.987+0000 7f47059a1140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 01 09:52:38 compute-2 ceph-mgr[76365]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 01 09:52:38 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'telemetry'
Dec 01 09:52:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:39 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 15 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:39.184+0000 7f47059a1140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 01 09:52:39 compute-2 ceph-mgr[76365]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 01 09:52:39 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'test_orchestrator'
Dec 01 09:52:39 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e98 e98: 3 total, 3 up, 3 in
Dec 01 09:52:39 compute-2 ceph-mon[76053]: Active manager daemon compute-0.fospow restarted
Dec 01 09:52:39 compute-2 ceph-mon[76053]: Activating manager daemon compute-0.fospow
Dec 01 09:52:39 compute-2 ceph-mon[76053]: osdmap e98: 3 total, 3 up, 3 in
Dec 01 09:52:39 compute-2 ceph-mon[76053]: mgrmap e29: compute-0.fospow(active, starting, since 0.384359s), standbys: compute-1.ymizfm, compute-2.kdtkls
Dec 01 09:52:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:39.455+0000 7f47059a1140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 01 09:52:39 compute-2 ceph-mgr[76365]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 01 09:52:39 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'volumes'
Dec 01 09:52:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:39 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 15 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:39.783+0000 7f47059a1140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 01 09:52:39 compute-2 ceph-mgr[76365]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 01 09:52:39 compute-2 ceph-mgr[76365]: mgr[py] Loading python module 'zabbix'
Dec 01 09:52:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec 01 09:52:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:39.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec 01 09:52:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:39 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:39 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:39.869+0000 7f47059a1140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 01 09:52:39 compute-2 ceph-mgr[76365]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 01 09:52:39 compute-2 ceph-mgr[76365]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:52:39 compute-2 ceph-mgr[76365]: mgr load Constructed class from module: dashboard
Dec 01 09:52:39 compute-2 ceph-mgr[76365]: [prometheus DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 01 09:52:39 compute-2 ceph-mgr[76365]: mgr load Constructed class from module: prometheus
Dec 01 09:52:39 compute-2 ceph-mgr[76365]: [prometheus INFO root] server_addr: :: server_port: 9283
Dec 01 09:52:39 compute-2 ceph-mgr[76365]: [prometheus INFO root] Starting engine...
Dec 01 09:52:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: [01/Dec/2025:09:52:39] ENGINE Bus STARTING
Dec 01 09:52:39 compute-2 ceph-mgr[76365]: [dashboard INFO root] server: ssl=no host=192.168.122.102 port=8443
Dec 01 09:52:39 compute-2 ceph-mgr[76365]: [prometheus INFO cherrypy.error] [01/Dec/2025:09:52:39] ENGINE Bus STARTING
Dec 01 09:52:39 compute-2 ceph-mgr[76365]: ms_deliver_dispatch: unhandled message 0x55814c2f3860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Dec 01 09:52:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: CherryPy Checker:
Dec 01 09:52:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: The Application mounted at '' has an empty config.
Dec 01 09:52:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 
Dec 01 09:52:39 compute-2 ceph-mgr[76365]: [dashboard INFO root] Configured CherryPy, starting engine...
Dec 01 09:52:39 compute-2 ceph-mgr[76365]: [dashboard INFO root] Starting engine...
Dec 01 09:52:39 compute-2 sshd-session[85894]: Accepted publickey for ceph-admin from 192.168.122.100 port 52216 ssh2: RSA SHA256:Z7lOpv0ev+jSExhAdyqEmkuT2VPaBkXv4p0gmi28XsM
Dec 01 09:52:39 compute-2 systemd-logind[795]: New session 37 of user ceph-admin.
Dec 01 09:52:39 compute-2 systemd[1]: Started Session 37 of User ceph-admin.
Dec 01 09:52:39 compute-2 sshd-session[85894]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 01 09:52:39 compute-2 ceph-mgr[76365]: [dashboard INFO root] Engine started...
Dec 01 09:52:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: [01/Dec/2025:09:52:39] ENGINE Serving on http://:::9283
Dec 01 09:52:39 compute-2 ceph-mgr[76365]: [prometheus INFO cherrypy.error] [01/Dec/2025:09:52:39] ENGINE Serving on http://:::9283
Dec 01 09:52:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: [01/Dec/2025:09:52:39] ENGINE Bus STARTED
Dec 01 09:52:39 compute-2 ceph-mgr[76365]: [prometheus INFO cherrypy.error] [01/Dec/2025:09:52:39] ENGINE Bus STARTED
Dec 01 09:52:39 compute-2 ceph-mgr[76365]: [prometheus INFO root] Engine started.
Dec 01 09:52:40 compute-2 sudo[85922]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:52:40 compute-2 sudo[85922]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:40 compute-2 sudo[85922]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:40 compute-2 sudo[85947]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Dec 01 09:52:40 compute-2 sudo[85947]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Dec 01 09:52:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 01 09:52:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Dec 01 09:52:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-0.xijran"}]: dispatch
Dec 01 09:52:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-1.ijlzoi"}]: dispatch
Dec 01 09:52:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-2.yoegjc"}]: dispatch
Dec 01 09:52:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mgr metadata", "who": "compute-0.fospow", "id": "compute-0.fospow"}]: dispatch
Dec 01 09:52:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mgr metadata", "who": "compute-1.ymizfm", "id": "compute-1.ymizfm"}]: dispatch
Dec 01 09:52:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mgr metadata", "who": "compute-2.kdtkls", "id": "compute-2.kdtkls"}]: dispatch
Dec 01 09:52:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 01 09:52:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 01 09:52:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 01 09:52:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mds metadata"}]: dispatch
Dec 01 09:52:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec 01 09:52:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mon metadata"}]: dispatch
Dec 01 09:52:40 compute-2 ceph-mon[76053]: Standby manager daemon compute-1.ymizfm restarted
Dec 01 09:52:40 compute-2 ceph-mon[76053]: Standby manager daemon compute-1.ymizfm started
Dec 01 09:52:40 compute-2 ceph-mon[76053]: Manager daemon compute-0.fospow is now available
Dec 01 09:52:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 09:52:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.fospow/mirror_snapshot_schedule"}]: dispatch
Dec 01 09:52:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.fospow/trash_purge_schedule"}]: dispatch
Dec 01 09:52:40 compute-2 ceph-mon[76053]: Standby manager daemon compute-2.kdtkls restarted
Dec 01 09:52:40 compute-2 ceph-mon[76053]: Standby manager daemon compute-2.kdtkls started
Dec 01 09:52:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:40 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ec0 fd 15 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec 01 09:52:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:40.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec 01 09:52:40 compute-2 podman[86044]: 2025-12-01 09:52:40.663373208 +0000 UTC m=+0.064473960 container exec 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec 01 09:52:40 compute-2 podman[86044]: 2025-12-01 09:52:40.759734292 +0000 UTC m=+0.160835024 container exec_died 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default)
Dec 01 09:52:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:40 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:40 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:41 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:41 compute-2 podman[86161]: 2025-12-01 09:52:41.330841512 +0000 UTC m=+0.172105943 container exec f15aa877aa74ea87a56ed7c269b1149a449e9df36b30b3012f6ca84c92ef9519 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 01 09:52:41 compute-2 ceph-mon[76053]: mgrmap e30: compute-0.fospow(active, since 1.5343s), standbys: compute-1.ymizfm, compute-2.kdtkls
Dec 01 09:52:41 compute-2 ceph-mon[76053]: pgmap v3: 353 pgs: 353 active+clean; 457 KiB data, 143 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:52:41 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Dec 01 09:52:41 compute-2 podman[86186]: 2025-12-01 09:52:41.403901245 +0000 UTC m=+0.055705981 container exec_died f15aa877aa74ea87a56ed7c269b1149a449e9df36b30b3012f6ca84c92ef9519 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 01 09:52:41 compute-2 podman[86161]: 2025-12-01 09:52:41.411182772 +0000 UTC m=+0.252447163 container exec_died f15aa877aa74ea87a56ed7c269b1149a449e9df36b30b3012f6ca84c92ef9519 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 01 09:52:41 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e99 e99: 3 total, 3 up, 3 in
Dec 01 09:52:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:41 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:41 compute-2 podman[86253]: 2025-12-01 09:52:41.744489737 +0000 UTC m=+0.056382651 container exec 7dc3478aba756353fd4c95c60545e4383e561bd972d5b694631e57cc304acfc8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:52:41 compute-2 podman[86253]: 2025-12-01 09:52:41.758145333 +0000 UTC m=+0.070038217 container exec_died 7dc3478aba756353fd4c95c60545e4383e561bd972d5b694631e57cc304acfc8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:52:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:52:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:41.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:52:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:41 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:41 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:41 compute-2 podman[86315]: 2025-12-01 09:52:41.970088506 +0000 UTC m=+0.052301715 container exec 5c83b48e4050b43a9798275fb3960cbbe72d63867925b25374b9306e495d3a3c (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt)
Dec 01 09:52:41 compute-2 podman[86315]: 2025-12-01 09:52:41.983063964 +0000 UTC m=+0.065277173 container exec_died 5c83b48e4050b43a9798275fb3960cbbe72d63867925b25374b9306e495d3a3c (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt)
Dec 01 09:52:42 compute-2 sshd-session[86394]: Accepted publickey for zuul from 192.168.122.30 port 45252 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 09:52:42 compute-2 systemd-logind[795]: New session 38 of user zuul.
Dec 01 09:52:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:42 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:42 compute-2 systemd[1]: Started Session 38 of User zuul.
Dec 01 09:52:42 compute-2 sshd-session[86394]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:52:42 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e99 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:52:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec 01 09:52:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:42.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec 01 09:52:42 compute-2 podman[86380]: 2025-12-01 09:52:42.509486326 +0000 UTC m=+0.360065854 container exec a0ef4b89ca18a14932dc44e4b2a521390b6fd555df2aaa06cada89bf83a23464 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9)
Dec 01 09:52:42 compute-2 podman[86380]: 2025-12-01 09:52:42.527191539 +0000 UTC m=+0.377771037 container exec_died a0ef4b89ca18a14932dc44e4b2a521390b6fd555df2aaa06cada89bf83a23464 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, version=2.2.4, vcs-type=git, io.openshift.tags=Ceph keepalived, release=1793, io.buildah.version=1.28.2, io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived)
Dec 01 09:52:42 compute-2 sudo[85947]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:42 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:42 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:42 compute-2 ceph-mon[76053]: [01/Dec/2025:09:52:40] ENGINE Bus STARTING
Dec 01 09:52:42 compute-2 ceph-mon[76053]: [01/Dec/2025:09:52:40] ENGINE Serving on http://192.168.122.100:8765
Dec 01 09:52:42 compute-2 ceph-mon[76053]: [01/Dec/2025:09:52:40] ENGINE Serving on https://192.168.122.100:7150
Dec 01 09:52:42 compute-2 ceph-mon[76053]: [01/Dec/2025:09:52:40] ENGINE Bus STARTED
Dec 01 09:52:42 compute-2 ceph-mon[76053]: [01/Dec/2025:09:52:40] ENGINE Client ('192.168.122.100', 52120) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 01 09:52:42 compute-2 ceph-mon[76053]: pgmap v4: 353 pgs: 353 active+clean; 457 KiB data, 143 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:52:42 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Dec 01 09:52:42 compute-2 ceph-mon[76053]: osdmap e99: 3 total, 3 up, 3 in
Dec 01 09:52:42 compute-2 ceph-mon[76053]: mgrmap e31: compute-0.fospow(active, since 2s), standbys: compute-1.ymizfm, compute-2.kdtkls
Dec 01 09:52:42 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:43 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:43 compute-2 python3.9[86600]: ansible-ansible.legacy.ping Invoked with data=pong
Dec 01 09:52:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:43 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ec0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:43 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e100 e100: 3 total, 3 up, 3 in
Dec 01 09:52:43 compute-2 sudo[86678]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:52:43 compute-2 sudo[86678]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:43 compute-2 sudo[86678]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:43 compute-2 sudo[86703]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 09:52:43 compute-2 sudo[86703]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:52:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:43.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:52:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:43 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:43 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:43 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:43 compute-2 ceph-mon[76053]: pgmap v6: 353 pgs: 353 active+clean; 457 KiB data, 143 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:52:43 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Dec 01 09:52:43 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:43 compute-2 ceph-mon[76053]: osdmap e100: 3 total, 3 up, 3 in
Dec 01 09:52:43 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:43 compute-2 ceph-mon[76053]: mgrmap e32: compute-0.fospow(active, since 4s), standbys: compute-1.ymizfm, compute-2.kdtkls
Dec 01 09:52:43 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:43 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:43 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Dec 01 09:52:43 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]': finished
Dec 01 09:52:44 compute-2 sudo[86703]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:44 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:44 compute-2 sudo[86854]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:52:44 compute-2 sudo[86854]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:44 compute-2 sudo[86854]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:44 compute-2 sudo[86882]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Dec 01 09:52:44 compute-2 sudo[86882]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:52:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:44.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:52:44 compute-2 python3.9[86860]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:52:44 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e101 e101: 3 total, 3 up, 3 in
Dec 01 09:52:44 compute-2 sudo[86882]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:44 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:44 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:45 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:45 compute-2 sudo[87081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmwyazdumkgutykuqytgmdnrruahceut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582765.1428692-96-49599795829221/AnsiballZ_command.py'
Dec 01 09:52:45 compute-2 sudo[87081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:52:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:45 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:45 compute-2 python3.9[87083]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:52:45 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Dec 01 09:52:45 compute-2 ceph-mon[76053]: osdmap e101: 3 total, 3 up, 3 in
Dec 01 09:52:45 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:45 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:45 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:45 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:45 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Dec 01 09:52:45 compute-2 ceph-mon[76053]: pgmap v9: 353 pgs: 353 active+clean; 457 KiB data, 143 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:52:45 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Dec 01 09:52:45 compute-2 sudo[87081]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:45 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e102 e102: 3 total, 3 up, 3 in
Dec 01 09:52:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec 01 09:52:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:45.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec 01 09:52:45 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 102 pg[10.f( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=82/82 les/c/f=83/83/0 sis=102) [2] r=0 lpr=102 pi=[82,102)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:52:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:45 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:45 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 102 pg[10.1f( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=82/82 les/c/f=83/83/0 sis=102) [2] r=0 lpr=102 pi=[82,102)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:52:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:45 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:45 compute-2 sudo[87109]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 01 09:52:45 compute-2 sudo[87109]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:45 compute-2 sudo[87109]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:46 compute-2 sudo[87134]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph
Dec 01 09:52:46 compute-2 sudo[87134]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:46 compute-2 sudo[87134]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:46 compute-2 sudo[87159]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.conf.new
Dec 01 09:52:46 compute-2 sudo[87159]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:46 compute-2 sudo[87159]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:46 compute-2 sudo[87184]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:52:46 compute-2 sudo[87184]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:46 compute-2 sudo[87184]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:46 compute-2 sudo[87209]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.conf.new
Dec 01 09:52:46 compute-2 sudo[87209]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:46 compute-2 sudo[87209]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:46 compute-2 sudo[87309]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.conf.new
Dec 01 09:52:46 compute-2 sudo[87309]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:46 compute-2 sudo[87309]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:46 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ee0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:46 compute-2 sudo[87334]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.conf.new
Dec 01 09:52:46 compute-2 sudo[87334]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:46 compute-2 sudo[87334]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:46 compute-2 sudo[87359]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 01 09:52:46 compute-2 sudo[87359]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:46 compute-2 sudo[87359]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:46 compute-2 sudo[87384]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config
Dec 01 09:52:46 compute-2 sudo[87384]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec 01 09:52:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:46.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec 01 09:52:46 compute-2 sudo[87384]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:46 compute-2 sudo[87409]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config
Dec 01 09:52:46 compute-2 sudo[87409]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:46 compute-2 sudo[87409]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:46 compute-2 sudo[87457]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf.new
Dec 01 09:52:46 compute-2 sudo[87457]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:46 compute-2 sudo[87457]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:46 compute-2 sudo[87488]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:52:46 compute-2 sudo[87488]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:46 compute-2 sudo[87488]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:46 compute-2 sudo[87578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcnszriwwilzqqfnppklxsleyhzbpzko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582766.2269366-132-164967209269202/AnsiballZ_stat.py'
Dec 01 09:52:46 compute-2 sudo[87538]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf.new
Dec 01 09:52:46 compute-2 sudo[87578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:52:46 compute-2 sudo[87538]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:46 compute-2 sudo[87538]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:46 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:46 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:46 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Dec 01 09:52:46 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:46 compute-2 ceph-mon[76053]: osdmap e102: 3 total, 3 up, 3 in
Dec 01 09:52:46 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:46 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec 01 09:52:46 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:52:46 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:52:46 compute-2 ceph-mon[76053]: Updating compute-0:/etc/ceph/ceph.conf
Dec 01 09:52:46 compute-2 ceph-mon[76053]: Updating compute-1:/etc/ceph/ceph.conf
Dec 01 09:52:46 compute-2 ceph-mon[76053]: Updating compute-2:/etc/ceph/ceph.conf
Dec 01 09:52:46 compute-2 ceph-mon[76053]: Updating compute-0:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec 01 09:52:46 compute-2 ceph-mon[76053]: Updating compute-2:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec 01 09:52:46 compute-2 ceph-mon[76053]: Updating compute-1:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec 01 09:52:46 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e103 e103: 3 total, 3 up, 3 in
Dec 01 09:52:46 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 103 pg[10.f( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=82/82 les/c/f=83/83/0 sis=103) [2]/[1] r=-1 lpr=103 pi=[82,103)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:46 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 103 pg[10.f( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=82/82 les/c/f=83/83/0 sis=103) [2]/[1] r=-1 lpr=103 pi=[82,103)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 01 09:52:46 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 103 pg[10.1f( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=82/82 les/c/f=83/83/0 sis=103) [2]/[1] r=-1 lpr=103 pi=[82,103)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:46 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 103 pg[10.1f( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=82/82 les/c/f=83/83/0 sis=103) [2]/[1] r=-1 lpr=103 pi=[82,103)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 01 09:52:46 compute-2 sudo[87609]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf.new
Dec 01 09:52:46 compute-2 sudo[87609]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:46 compute-2 sudo[87609]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:46 compute-2 python3.9[87583]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:52:46 compute-2 sudo[87634]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf.new
Dec 01 09:52:46 compute-2 sudo[87634]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:46 compute-2 sudo[87634]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:46 compute-2 sudo[87578]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:47 compute-2 sudo[87661]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf.new /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec 01 09:52:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:47 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:47 compute-2 sudo[87661]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:47 compute-2 sudo[87661]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:47 compute-2 sudo[87704]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 01 09:52:47 compute-2 sudo[87704]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:47 compute-2 sudo[87704]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:47 compute-2 sudo[87735]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph
Dec 01 09:52:47 compute-2 sudo[87735]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:47 compute-2 sudo[87735]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:47 compute-2 sudo[87760]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.client.admin.keyring.new
Dec 01 09:52:47 compute-2 sudo[87760]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:47 compute-2 sudo[87760]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:47 compute-2 sudo[87785]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:52:47 compute-2 sudo[87785]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:47 compute-2 sudo[87785]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:47 compute-2 sudo[87816]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.client.admin.keyring.new
Dec 01 09:52:47 compute-2 sudo[87816]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:47 compute-2 sudo[87816]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:47 compute-2 sudo[87911]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.client.admin.keyring.new
Dec 01 09:52:47 compute-2 sudo[87911]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:47 compute-2 sudo[87911]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:47 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:52:47 compute-2 sudo[87936]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.client.admin.keyring.new
Dec 01 09:52:47 compute-2 sudo[87936]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:47 compute-2 sudo[87936]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:47 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:47 compute-2 sudo[87978]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 01 09:52:47 compute-2 sudo[87978]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:47 compute-2 sudo[87978]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:47 compute-2 sudo[88029]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config
Dec 01 09:52:47 compute-2 sudo[88029]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:47 compute-2 sudo[88029]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:47 compute-2 sudo[88092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfszbyddrijgttsvoxwpzvupggdzaten ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582767.306445-164-212479056320451/AnsiballZ_file.py'
Dec 01 09:52:47 compute-2 sudo[88092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:52:47 compute-2 sudo[88074]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config
Dec 01 09:52:47 compute-2 sudo[88074]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:47 compute-2 sudo[88074]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:47 compute-2 sudo[88112]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring.new
Dec 01 09:52:47 compute-2 sudo[88112]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:47 compute-2 sudo[88112]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:52:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:47.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:52:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:47 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:47 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:47 compute-2 sudo[88137]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:52:47 compute-2 sudo[88137]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:47 compute-2 sudo[88137]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:47 compute-2 python3.9[88109]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:52:47 compute-2 sudo[88162]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring.new
Dec 01 09:52:47 compute-2 sudo[88162]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:47 compute-2 sudo[88162]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:47 compute-2 sudo[88092]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:48 compute-2 sudo[88231]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring.new
Dec 01 09:52:48 compute-2 sudo[88231]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:48 compute-2 sudo[88231]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:48 compute-2 sudo[88259]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring.new
Dec 01 09:52:48 compute-2 sudo[88259]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:48 compute-2 sudo[88259]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:48 compute-2 sudo[88284]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-365f19c2-81e5-5edd-b6b4-280555214d3a/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring.new /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec 01 09:52:48 compute-2 sudo[88284]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:48 compute-2 sudo[88284]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:48 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:48 compute-2 sudo[88434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-laskbyfbiuxvyckmbnnncotfxlujzkfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582768.2049644-191-99266347682758/AnsiballZ_file.py'
Dec 01 09:52:48 compute-2 sudo[88434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:52:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000027s ======
Dec 01 09:52:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:48.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec 01 09:52:48 compute-2 python3.9[88436]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:52:48 compute-2 sudo[88434]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:48 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:48 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:49 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:49 compute-2 python3.9[88588]: ansible-ansible.builtin.service_facts Invoked
Dec 01 09:52:49 compute-2 network[88605]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 01 09:52:49 compute-2 network[88606]: 'network-scripts' will be removed from distribution in near future.
Dec 01 09:52:49 compute-2 network[88607]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 01 09:52:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:49 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009f20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:52:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:49.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:52:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:49 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:49 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:50 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e104 e104: 3 total, 3 up, 3 in
Dec 01 09:52:50 compute-2 ceph-mon[76053]: osdmap e103: 3 total, 3 up, 3 in
Dec 01 09:52:50 compute-2 ceph-mon[76053]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Dec 01 09:52:50 compute-2 ceph-mon[76053]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Dec 01 09:52:50 compute-2 ceph-mon[76053]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Dec 01 09:52:50 compute-2 ceph-mon[76053]: pgmap v12: 353 pgs: 2 active+remapped, 351 active+clean; 457 KiB data, 144 MiB used, 60 GiB / 60 GiB avail; 44 KiB/s rd, 0 B/s wr, 18 op/s; 82 B/s, 3 objects/s recovering
Dec 01 09:52:50 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Dec 01 09:52:50 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 104 pg[10.10( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=104) [2] r=0 lpr=104 pi=[61,104)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:52:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:50 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:52:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:50.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:52:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:50 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:50 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:51 compute-2 ceph-mon[76053]: Updating compute-0:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec 01 09:52:51 compute-2 ceph-mon[76053]: Updating compute-2:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec 01 09:52:51 compute-2 ceph-mon[76053]: Updating compute-1:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec 01 09:52:51 compute-2 ceph-mon[76053]: pgmap v13: 353 pgs: 2 active+remapped, 351 active+clean; 457 KiB data, 144 MiB used, 60 GiB / 60 GiB avail; 32 KiB/s rd, 0 B/s wr, 13 op/s; 58 B/s, 2 objects/s recovering
Dec 01 09:52:51 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Dec 01 09:52:51 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Dec 01 09:52:51 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:51 compute-2 ceph-mon[76053]: osdmap e104: 3 total, 3 up, 3 in
Dec 01 09:52:51 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:51 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:51 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:51 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:51 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:51 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:51 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:51 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:52:51 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:52:51 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:52:51 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e105 e105: 3 total, 3 up, 3 in
Dec 01 09:52:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:51 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:51 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 105 pg[10.10( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=105) [2]/[1] r=-1 lpr=105 pi=[61,105)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:51 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 105 pg[10.10( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=105) [2]/[1] r=-1 lpr=105 pi=[61,105)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 01 09:52:51 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 105 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=103/82 les/c/f=104/83/0 sis=105) [2] r=0 lpr=105 pi=[82,105)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:51 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 105 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=103/82 les/c/f=104/83/0 sis=105) [2] r=0 lpr=105 pi=[82,105)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:52:51 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 105 pg[10.f( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=7 ec=61/50 lis/c=103/82 les/c/f=104/83/0 sis=105) [2] r=0 lpr=105 pi=[82,105)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:51 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 105 pg[10.f( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=7 ec=61/50 lis/c=103/82 les/c/f=104/83/0 sis=105) [2] r=0 lpr=105 pi=[82,105)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:52:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:51 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.002000056s ======
Dec 01 09:52:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:51.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Dec 01 09:52:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:51 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:51 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:52 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Dec 01 09:52:52 compute-2 ceph-mon[76053]: osdmap e105: 3 total, 3 up, 3 in
Dec 01 09:52:52 compute-2 ceph-mon[76053]: pgmap v16: 353 pgs: 1 active+recovering+remapped, 1 unknown, 1 active+remapped, 350 active+clean; 457 KiB data, 144 MiB used, 60 GiB / 60 GiB avail; 189 B/s rd, 0 op/s; 4/221 objects misplaced (1.810%); 40 B/s, 1 objects/s recovering
Dec 01 09:52:52 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e106 e106: 3 total, 3 up, 3 in
Dec 01 09:52:52 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 106 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=105/106 n=5 ec=61/50 lis/c=103/82 les/c/f=104/83/0 sis=105) [2] r=0 lpr=105 pi=[82,105)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:52:52 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 106 pg[10.f( v 56'1015 (0'0,56'1015] local-lis/les=105/106 n=7 ec=61/50 lis/c=103/82 les/c/f=104/83/0 sis=105) [2] r=0 lpr=105 pi=[82,105)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:52:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:52 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:52 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e106 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:52:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000027s ======
Dec 01 09:52:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:52.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec 01 09:52:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:52 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:52 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:53 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009fd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:53 compute-2 ceph-mon[76053]: osdmap e106: 3 total, 3 up, 3 in
Dec 01 09:52:53 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e107 e107: 3 total, 3 up, 3 in
Dec 01 09:52:53 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 107 pg[10.10( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=2 ec=61/50 lis/c=105/61 les/c/f=106/62/0 sis=107) [2] r=0 lpr=107 pi=[61,107)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:52:53 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 107 pg[10.10( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=2 ec=61/50 lis/c=105/61 les/c/f=106/62/0 sis=107) [2] r=0 lpr=107 pi=[61,107)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:52:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:53 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:52:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:53.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:52:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:53 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:53 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:54 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:54 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e108 e108: 3 total, 3 up, 3 in
Dec 01 09:52:54 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 108 pg[10.10( v 56'1015 (0'0,56'1015] local-lis/les=107/108 n=2 ec=61/50 lis/c=105/61 les/c/f=106/62/0 sis=107) [2] r=0 lpr=107 pi=[61,107)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:52:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:52:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:54.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:52:54 compute-2 python3.9[88873]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:52:54 compute-2 ceph-mon[76053]: pgmap v18: 353 pgs: 1 active+recovering+remapped, 1 unknown, 1 active+remapped, 350 active+clean; 457 KiB data, 144 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s; 4/221 objects misplaced (1.810%); 36 B/s, 1 objects/s recovering
Dec 01 09:52:54 compute-2 ceph-mon[76053]: osdmap e107: 3 total, 3 up, 3 in
Dec 01 09:52:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:54 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:54 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:55 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:55 compute-2 python3.9[89024]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:52:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:55 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ff0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:55 compute-2 ceph-mon[76053]: osdmap e108: 3 total, 3 up, 3 in
Dec 01 09:52:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 09:52:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:55 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:52:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:55.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:52:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:55 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:56 compute-2 sudo[89054]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:52:56 compute-2 sudo[89054]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:56 compute-2 sudo[89054]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:56 compute-2 sudo[89100]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:52:56 compute-2 sudo[89100]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:52:56 compute-2 sudo[89100]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:56 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ff0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:52:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:56.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:52:56 compute-2 python3.9[89229]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:52:56 compute-2 ceph-mon[76053]: pgmap v21: 353 pgs: 1 active+recovering+remapped, 1 unknown, 1 active+remapped, 350 active+clean; 457 KiB data, 144 MiB used, 60 GiB / 60 GiB avail; 4/221 objects misplaced (1.810%)
Dec 01 09:52:56 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:56 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:56 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:56 compute-2 ceph-mon[76053]: Reconfiguring mon.compute-0 (monmap changed)...
Dec 01 09:52:56 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Dec 01 09:52:56 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Dec 01 09:52:56 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:52:56 compute-2 ceph-mon[76053]: Reconfiguring daemon mon.compute-0 on compute-0
Dec 01 09:52:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:56 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:56 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:57 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:57 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:52:57 compute-2 sudo[89387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kaufjczaowgbxomxiwzmvypgcfnlrdzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582777.3079443-336-254026517252934/AnsiballZ_setup.py'
Dec 01 09:52:57 compute-2 sudo[89387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:52:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:57 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:57 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:57 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:52:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:57.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:52:57 compute-2 python3.9[89389]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:52:57 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:57 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:57 compute-2 ceph-mon[76053]: Reconfiguring mgr.compute-0.fospow (monmap changed)...
Dec 01 09:52:57 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.fospow", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec 01 09:52:57 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 09:52:57 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:52:57 compute-2 ceph-mon[76053]: Reconfiguring daemon mgr.compute-0.fospow on compute-0
Dec 01 09:52:57 compute-2 ceph-mon[76053]: pgmap v22: 353 pgs: 353 active+clean; 457 KiB data, 144 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s; 0 B/s, 0 objects/s recovering
Dec 01 09:52:57 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Dec 01 09:52:57 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:57 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:57 compute-2 ceph-mon[76053]: Reconfiguring crash.compute-0 (monmap changed)...
Dec 01 09:52:57 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Dec 01 09:52:57 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:52:57 compute-2 ceph-mon[76053]: Reconfiguring daemon crash.compute-0 on compute-0
Dec 01 09:52:57 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e109 e109: 3 total, 3 up, 3 in
Dec 01 09:52:58 compute-2 sudo[89387]: pam_unix(sudo:session): session closed for user root
Dec 01 09:52:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:58 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:58 compute-2 sudo[89471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycshpdxkezouzugggpobyrgyndaqsbgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582777.3079443-336-254026517252934/AnsiballZ_dnf.py'
Dec 01 09:52:58 compute-2 sudo[89471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:52:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec 01 09:52:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:58.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec 01 09:52:58 compute-2 python3.9[89473]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:52:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:58 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:58 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:58 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Dec 01 09:52:58 compute-2 ceph-mon[76053]: osdmap e109: 3 total, 3 up, 3 in
Dec 01 09:52:58 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:58 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:52:58 compute-2 ceph-mon[76053]: Reconfiguring osd.1 (monmap changed)...
Dec 01 09:52:58 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Dec 01 09:52:58 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:52:58 compute-2 ceph-mon[76053]: Reconfiguring daemon osd.1 on compute-0
Dec 01 09:52:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:59 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a010 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:59 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4002e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:52:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:59 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:59 2025: (VI_0) received an invalid passwd!
Dec 01 09:52:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:52:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:52:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:59.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:00 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000027s ======
Dec 01 09:53:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:00.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec 01 09:53:00 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:53:00 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:53:00 compute-2 ceph-mon[76053]: Reconfiguring node-exporter.compute-0 (unknown last config time)...
Dec 01 09:53:00 compute-2 ceph-mon[76053]: Reconfiguring daemon node-exporter.compute-0 on compute-0
Dec 01 09:53:00 compute-2 ceph-mon[76053]: pgmap v24: 353 pgs: 353 active+clean; 457 KiB data, 144 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s; 0 B/s, 0 objects/s recovering
Dec 01 09:53:00 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Dec 01 09:53:00 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e110 e110: 3 total, 3 up, 3 in
Dec 01 09:53:00 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 110 pg[10.12( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=110) [2] r=0 lpr=110 pi=[71,110)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:53:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:00 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:00 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:01 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:01 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a030 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:01 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:01 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:01.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:01 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Dec 01 09:53:01 compute-2 ceph-mon[76053]: osdmap e110: 3 total, 3 up, 3 in
Dec 01 09:53:01 compute-2 ceph-mon[76053]: pgmap v26: 353 pgs: 353 active+clean; 457 KiB data, 144 MiB used, 60 GiB / 60 GiB avail; 448 B/s rd, 0 op/s; 0 B/s, 0 objects/s recovering
Dec 01 09:53:01 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:53:01 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Dec 01 09:53:01 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:53:01 compute-2 ceph-mon[76053]: Reconfiguring alertmanager.compute-0 (dependencies changed)...
Dec 01 09:53:01 compute-2 ceph-mon[76053]: Reconfiguring daemon alertmanager.compute-0 on compute-0
Dec 01 09:53:02 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e111 e111: 3 total, 3 up, 3 in
Dec 01 09:53:02 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 111 pg[10.12( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=111) [2]/[0] r=-1 lpr=111 pi=[71,111)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:53:02 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 111 pg[10.13( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=5 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=111 pruub=8.873264313s) [0] r=-1 lpr=111 pi=[69,111)/1 crt=56'1015 mlcod 0'0 active pruub 193.361557007s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:53:02 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 111 pg[10.13( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=5 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=111 pruub=8.873211861s) [0] r=-1 lpr=111 pi=[69,111)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 193.361557007s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:53:02 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 111 pg[10.12( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=111) [2]/[0] r=-1 lpr=111 pi=[71,111)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 01 09:53:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:02 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4002e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:02 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:53:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:02.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:02 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:02 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:03 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:03 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e112 e112: 3 total, 3 up, 3 in
Dec 01 09:53:03 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 112 pg[10.13( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=5 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=112) [0]/[2] r=0 lpr=112 pi=[69,112)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:53:03 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 112 pg[10.13( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=5 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=112) [0]/[2] r=0 lpr=112 pi=[69,112)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec 01 09:53:03 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Dec 01 09:53:03 compute-2 ceph-mon[76053]: osdmap e111: 3 total, 3 up, 3 in
Dec 01 09:53:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:03 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:03 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:03 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:03.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:04 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e113 e113: 3 total, 3 up, 3 in
Dec 01 09:53:04 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 113 pg[10.12( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=4 ec=61/50 lis/c=111/71 les/c/f=112/72/0 sis=113) [2] r=0 lpr=113 pi=[71,113)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:53:04 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 113 pg[10.12( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=4 ec=61/50 lis/c=111/71 les/c/f=112/72/0 sis=113) [2] r=0 lpr=113 pi=[71,113)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:53:04 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 113 pg[10.14( v 56'1015 (0'0,56'1015] local-lis/les=77/78 n=5 ec=61/50 lis/c=77/77 les/c/f=78/78/0 sis=113 pruub=15.465592384s) [0] r=-1 lpr=113 pi=[77,113)/1 crt=56'1015 mlcod 0'0 active pruub 202.135375977s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:53:04 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 113 pg[10.14( v 56'1015 (0'0,56'1015] local-lis/les=77/78 n=5 ec=61/50 lis/c=77/77 les/c/f=78/78/0 sis=113 pruub=15.465529442s) [0] r=-1 lpr=113 pi=[77,113)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 202.135375977s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:53:04 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 113 pg[10.13( v 56'1015 (0'0,56'1015] local-lis/les=112/113 n=5 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=112) [0]/[2] async=[0] r=0 lpr=112 pi=[69,112)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:53:04 compute-2 ceph-mon[76053]: osdmap e112: 3 total, 3 up, 3 in
Dec 01 09:53:04 compute-2 ceph-mon[76053]: pgmap v29: 353 pgs: 353 active+clean; 457 KiB data, 144 MiB used, 60 GiB / 60 GiB avail; 191 B/s rd, 0 op/s
Dec 01 09:53:04 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Dec 01 09:53:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:04 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec 01 09:53:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:04.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec 01 09:53:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:04 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:04 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:05 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:05 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e114 e114: 3 total, 3 up, 3 in
Dec 01 09:53:05 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 114 pg[10.13( v 56'1015 (0'0,56'1015] local-lis/les=112/113 n=5 ec=61/50 lis/c=112/69 les/c/f=113/70/0 sis=114 pruub=14.903059006s) [0] async=[0] r=-1 lpr=114 pi=[69,114)/1 crt=56'1015 mlcod 56'1015 active pruub 202.732254028s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:53:05 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 114 pg[10.13( v 56'1015 (0'0,56'1015] local-lis/les=112/113 n=5 ec=61/50 lis/c=112/69 les/c/f=113/70/0 sis=114 pruub=14.902947426s) [0] r=-1 lpr=114 pi=[69,114)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 202.732254028s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:53:05 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 114 pg[10.14( v 56'1015 (0'0,56'1015] local-lis/les=77/78 n=5 ec=61/50 lis/c=77/77 les/c/f=78/78/0 sis=114) [0]/[2] r=0 lpr=114 pi=[77,114)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:53:05 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 114 pg[10.14( v 56'1015 (0'0,56'1015] local-lis/les=77/78 n=5 ec=61/50 lis/c=77/77 les/c/f=78/78/0 sis=114) [0]/[2] r=0 lpr=114 pi=[77,114)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec 01 09:53:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:05 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:05 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:05 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:05.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:05 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Dec 01 09:53:05 compute-2 ceph-mon[76053]: osdmap e113: 3 total, 3 up, 3 in
Dec 01 09:53:05 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Dec 01 09:53:05 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 114 pg[10.12( v 56'1015 (0'0,56'1015] local-lis/les=113/114 n=4 ec=61/50 lis/c=111/71 les/c/f=112/72/0 sis=113) [2] r=0 lpr=113 pi=[71,113)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:53:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:06 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e115 e115: 3 total, 3 up, 3 in
Dec 01 09:53:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000027s ======
Dec 01 09:53:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:06.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec 01 09:53:06 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 115 pg[10.14( v 56'1015 (0'0,56'1015] local-lis/les=114/115 n=5 ec=61/50 lis/c=77/77 les/c/f=78/78/0 sis=114) [0]/[2] async=[0] r=0 lpr=114 pi=[77,114)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:53:06 compute-2 ceph-mon[76053]: pgmap v31: 353 pgs: 353 active+clean; 457 KiB data, 144 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:53:06 compute-2 ceph-mon[76053]: osdmap e114: 3 total, 3 up, 3 in
Dec 01 09:53:06 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:53:06 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:53:06 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Dec 01 09:53:06 compute-2 ceph-mon[76053]: osdmap e115: 3 total, 3 up, 3 in
Dec 01 09:53:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:06 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:06 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:07 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4002e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:53:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:07 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:07 compute-2 ceph-mon[76053]: Reconfiguring grafana.compute-0 (dependencies changed)...
Dec 01 09:53:07 compute-2 ceph-mon[76053]: Reconfiguring daemon grafana.compute-0 on compute-0
Dec 01 09:53:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:07 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:07 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e116 e116: 3 total, 3 up, 3 in
Dec 01 09:53:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:07.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 116 pg[10.14( v 56'1015 (0'0,56'1015] local-lis/les=114/115 n=5 ec=61/50 lis/c=114/77 les/c/f=115/78/0 sis=116 pruub=14.599020958s) [0] async=[0] r=-1 lpr=116 pi=[77,116)/1 crt=56'1015 mlcod 56'1015 active pruub 204.976043701s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:53:07 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 116 pg[10.14( v 56'1015 (0'0,56'1015] local-lis/les=114/115 n=5 ec=61/50 lis/c=114/77 les/c/f=115/78/0 sis=116 pruub=14.598871231s) [0] r=-1 lpr=116 pi=[77,116)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 204.976043701s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:53:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:08 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:08.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:08 compute-2 ceph-mon[76053]: pgmap v34: 353 pgs: 1 remapped+peering, 1 activating, 351 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 487 B/s rd, 0 op/s; 26 B/s, 0 objects/s recovering
Dec 01 09:53:08 compute-2 ceph-mon[76053]: osdmap e116: 3 total, 3 up, 3 in
Dec 01 09:53:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:08 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:08 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:08 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e117 e117: 3 total, 3 up, 3 in
Dec 01 09:53:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:09 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:09 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4003e10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:09 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:09 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000027s ======
Dec 01 09:53:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:09.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec 01 09:53:10 compute-2 ceph-mon[76053]: osdmap e117: 3 total, 3 up, 3 in
Dec 01 09:53:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:53:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:53:10 compute-2 ceph-mon[76053]: Reconfiguring crash.compute-1 (monmap changed)...
Dec 01 09:53:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Dec 01 09:53:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:53:10 compute-2 ceph-mon[76053]: Reconfiguring daemon crash.compute-1 on compute-1
Dec 01 09:53:10 compute-2 ceph-mon[76053]: pgmap v37: 353 pgs: 1 remapped+peering, 1 activating, 351 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s; 27 B/s, 0 objects/s recovering
Dec 01 09:53:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 09:53:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:53:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:53:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Dec 01 09:53:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:53:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:10 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a0b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:10.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:10 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:10 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:11 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:11 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e118 e118: 3 total, 3 up, 3 in
Dec 01 09:53:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:11 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:11 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:11 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:11 compute-2 ceph-mon[76053]: Reconfiguring osd.0 (monmap changed)...
Dec 01 09:53:11 compute-2 ceph-mon[76053]: Reconfiguring daemon osd.0 on compute-1
Dec 01 09:53:11 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Dec 01 09:53:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:11.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:12 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e118 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 01 09:53:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:12.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:12 compute-2 sudo[89561]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:53:12 compute-2 sudo[89561]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:53:12 compute-2 sudo[89561]: pam_unix(sudo:session): session closed for user root
Dec 01 09:53:12 compute-2 sudo[89586]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:53:12 compute-2 sudo[89586]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:53:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:12 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:12 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:12 compute-2 ceph-mon[76053]: pgmap v38: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 523 B/s rd, 0 op/s; 37 B/s, 2 objects/s recovering
Dec 01 09:53:12 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Dec 01 09:53:12 compute-2 ceph-mon[76053]: osdmap e118: 3 total, 3 up, 3 in
Dec 01 09:53:12 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:53:12 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:53:12 compute-2 ceph-mon[76053]: Reconfiguring mon.compute-1 (monmap changed)...
Dec 01 09:53:12 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Dec 01 09:53:12 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Dec 01 09:53:12 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:53:12 compute-2 ceph-mon[76053]: Reconfiguring daemon mon.compute-1 on compute-1
Dec 01 09:53:12 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:53:12 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:53:12 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Dec 01 09:53:12 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Dec 01 09:53:12 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:53:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:13 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4003e10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:13 compute-2 podman[89627]: 2025-12-01 09:53:13.079651736 +0000 UTC m=+0.042638550 container create 75a5cb0eadfb531e99991d20ff60135bc494329e920078c0153e62802c10ac97 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=busy_khorana, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:53:13 compute-2 systemd[1]: Started libpod-conmon-75a5cb0eadfb531e99991d20ff60135bc494329e920078c0153e62802c10ac97.scope.
Dec 01 09:53:13 compute-2 systemd[1]: Started libcrun container.
Dec 01 09:53:13 compute-2 podman[89627]: 2025-12-01 09:53:13.060579716 +0000 UTC m=+0.023566560 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:53:13 compute-2 podman[89627]: 2025-12-01 09:53:13.161871479 +0000 UTC m=+0.124858323 container init 75a5cb0eadfb531e99991d20ff60135bc494329e920078c0153e62802c10ac97 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=busy_khorana, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 01 09:53:13 compute-2 podman[89627]: 2025-12-01 09:53:13.168403774 +0000 UTC m=+0.131390598 container start 75a5cb0eadfb531e99991d20ff60135bc494329e920078c0153e62802c10ac97 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=busy_khorana, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:53:13 compute-2 podman[89627]: 2025-12-01 09:53:13.172115 +0000 UTC m=+0.135101844 container attach 75a5cb0eadfb531e99991d20ff60135bc494329e920078c0153e62802c10ac97 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=busy_khorana, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:53:13 compute-2 busy_khorana[89644]: 167 167
Dec 01 09:53:13 compute-2 systemd[1]: libpod-75a5cb0eadfb531e99991d20ff60135bc494329e920078c0153e62802c10ac97.scope: Deactivated successfully.
Dec 01 09:53:13 compute-2 podman[89627]: 2025-12-01 09:53:13.175005612 +0000 UTC m=+0.137992456 container died 75a5cb0eadfb531e99991d20ff60135bc494329e920078c0153e62802c10ac97 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=busy_khorana, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0)
Dec 01 09:53:13 compute-2 systemd[1]: var-lib-containers-storage-overlay-f1a2b50939b7ecdede666775e55d8cdec6090441022c5e4f42f0a5866d0a70be-merged.mount: Deactivated successfully.
Dec 01 09:53:13 compute-2 podman[89627]: 2025-12-01 09:53:13.218444764 +0000 UTC m=+0.181431588 container remove 75a5cb0eadfb531e99991d20ff60135bc494329e920078c0153e62802c10ac97 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=busy_khorana, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:53:13 compute-2 systemd[1]: libpod-conmon-75a5cb0eadfb531e99991d20ff60135bc494329e920078c0153e62802c10ac97.scope: Deactivated successfully.
Dec 01 09:53:13 compute-2 sudo[89586]: pam_unix(sudo:session): session closed for user root
Dec 01 09:53:13 compute-2 sudo[89662]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:53:13 compute-2 sudo[89662]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:53:13 compute-2 sudo[89662]: pam_unix(sudo:session): session closed for user root
Dec 01 09:53:13 compute-2 sudo[89687]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:53:13 compute-2 sudo[89687]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:53:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:13 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:13 compute-2 podman[89728]: 2025-12-01 09:53:13.816719855 +0000 UTC m=+0.041096487 container create 0415996941d46f565418d642bb83fc8e9bbfd2934f899bde08af305b4ac1979e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=crazy_jemison, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 01 09:53:13 compute-2 systemd[1]: Started libpod-conmon-0415996941d46f565418d642bb83fc8e9bbfd2934f899bde08af305b4ac1979e.scope.
Dec 01 09:53:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:13 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:13 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:13.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:13 compute-2 systemd[1]: Started libcrun container.
Dec 01 09:53:13 compute-2 podman[89728]: 2025-12-01 09:53:13.895139869 +0000 UTC m=+0.119516521 container init 0415996941d46f565418d642bb83fc8e9bbfd2934f899bde08af305b4ac1979e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=crazy_jemison, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 01 09:53:13 compute-2 podman[89728]: 2025-12-01 09:53:13.799577459 +0000 UTC m=+0.023954101 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:53:13 compute-2 podman[89728]: 2025-12-01 09:53:13.901392916 +0000 UTC m=+0.125769548 container start 0415996941d46f565418d642bb83fc8e9bbfd2934f899bde08af305b4ac1979e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=crazy_jemison, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:53:13 compute-2 ceph-mon[76053]: Reconfiguring mon.compute-2 (monmap changed)...
Dec 01 09:53:13 compute-2 ceph-mon[76053]: Reconfiguring daemon mon.compute-2 on compute-2
Dec 01 09:53:13 compute-2 ceph-mon[76053]: pgmap v40: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s; 18 B/s, 1 objects/s recovering
Dec 01 09:53:13 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Dec 01 09:53:13 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:53:13 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:53:13 compute-2 ceph-mon[76053]: Reconfiguring mgr.compute-2.kdtkls (monmap changed)...
Dec 01 09:53:13 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.kdtkls", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec 01 09:53:13 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 09:53:13 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:53:13 compute-2 ceph-mon[76053]: Reconfiguring daemon mgr.compute-2.kdtkls on compute-2
Dec 01 09:53:13 compute-2 podman[89728]: 2025-12-01 09:53:13.909991801 +0000 UTC m=+0.134368433 container attach 0415996941d46f565418d642bb83fc8e9bbfd2934f899bde08af305b4ac1979e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=crazy_jemison, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325)
Dec 01 09:53:13 compute-2 crazy_jemison[89744]: 167 167
Dec 01 09:53:13 compute-2 systemd[1]: libpod-0415996941d46f565418d642bb83fc8e9bbfd2934f899bde08af305b4ac1979e.scope: Deactivated successfully.
Dec 01 09:53:13 compute-2 podman[89728]: 2025-12-01 09:53:13.912885433 +0000 UTC m=+0.137262065 container died 0415996941d46f565418d642bb83fc8e9bbfd2934f899bde08af305b4ac1979e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=crazy_jemison, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:53:13 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e119 e119: 3 total, 3 up, 3 in
Dec 01 09:53:13 compute-2 systemd[1]: var-lib-containers-storage-overlay-474d665aec6561cb8a612523084d2450225c416e5a188e82f583e1b1011632a1-merged.mount: Deactivated successfully.
Dec 01 09:53:13 compute-2 podman[89728]: 2025-12-01 09:53:13.956727866 +0000 UTC m=+0.181104498 container remove 0415996941d46f565418d642bb83fc8e9bbfd2934f899bde08af305b4ac1979e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=crazy_jemison, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:53:13 compute-2 systemd[1]: libpod-conmon-0415996941d46f565418d642bb83fc8e9bbfd2934f899bde08af305b4ac1979e.scope: Deactivated successfully.
Dec 01 09:53:14 compute-2 sudo[89687]: pam_unix(sudo:session): session closed for user root
Dec 01 09:53:14 compute-2 sudo[89760]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:53:14 compute-2 sudo[89760]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:53:14 compute-2 sudo[89760]: pam_unix(sudo:session): session closed for user root
Dec 01 09:53:14 compute-2 sudo[89785]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 365f19c2-81e5-5edd-b6b4-280555214d3a
Dec 01 09:53:14 compute-2 sudo[89785]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:53:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:14 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:14 compute-2 podman[89825]: 2025-12-01 09:53:14.536387669 +0000 UTC m=+0.045412760 container create 31da20ef2c3421e427ae20a31d3ebb8e99e3132c02c9991d0396ce6f59b6d41e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_bartik, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS)
Dec 01 09:53:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:14.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:14 compute-2 systemd[1]: Started libpod-conmon-31da20ef2c3421e427ae20a31d3ebb8e99e3132c02c9991d0396ce6f59b6d41e.scope.
Dec 01 09:53:14 compute-2 systemd[1]: Started libcrun container.
Dec 01 09:53:14 compute-2 podman[89825]: 2025-12-01 09:53:14.605766007 +0000 UTC m=+0.114791128 container init 31da20ef2c3421e427ae20a31d3ebb8e99e3132c02c9991d0396ce6f59b6d41e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_bartik, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:53:14 compute-2 podman[89825]: 2025-12-01 09:53:14.517853084 +0000 UTC m=+0.026878205 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:53:14 compute-2 podman[89825]: 2025-12-01 09:53:14.614393032 +0000 UTC m=+0.123418123 container start 31da20ef2c3421e427ae20a31d3ebb8e99e3132c02c9991d0396ce6f59b6d41e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_bartik, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 09:53:14 compute-2 sad_bartik[89842]: 167 167
Dec 01 09:53:14 compute-2 systemd[1]: libpod-31da20ef2c3421e427ae20a31d3ebb8e99e3132c02c9991d0396ce6f59b6d41e.scope: Deactivated successfully.
Dec 01 09:53:14 compute-2 conmon[89842]: conmon 31da20ef2c3421e427ae <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-31da20ef2c3421e427ae20a31d3ebb8e99e3132c02c9991d0396ce6f59b6d41e.scope/container/memory.events
Dec 01 09:53:14 compute-2 podman[89825]: 2025-12-01 09:53:14.618256791 +0000 UTC m=+0.127281882 container attach 31da20ef2c3421e427ae20a31d3ebb8e99e3132c02c9991d0396ce6f59b6d41e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_bartik, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2)
Dec 01 09:53:14 compute-2 podman[89825]: 2025-12-01 09:53:14.618684344 +0000 UTC m=+0.127709465 container died 31da20ef2c3421e427ae20a31d3ebb8e99e3132c02c9991d0396ce6f59b6d41e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_bartik, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:53:14 compute-2 systemd[1]: var-lib-containers-storage-overlay-d4973813f18ad2a03e1001d77c85b5137e4631178d3363ba4580673c1b8fc917-merged.mount: Deactivated successfully.
Dec 01 09:53:14 compute-2 podman[89825]: 2025-12-01 09:53:14.662448975 +0000 UTC m=+0.171474056 container remove 31da20ef2c3421e427ae20a31d3ebb8e99e3132c02c9991d0396ce6f59b6d41e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_bartik, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:53:14 compute-2 systemd[1]: libpod-conmon-31da20ef2c3421e427ae20a31d3ebb8e99e3132c02c9991d0396ce6f59b6d41e.scope: Deactivated successfully.
Dec 01 09:53:14 compute-2 sudo[89785]: pam_unix(sudo:session): session closed for user root
Dec 01 09:53:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:14 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:14 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:15 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003e10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:15 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Dec 01 09:53:15 compute-2 ceph-mon[76053]: osdmap e119: 3 total, 3 up, 3 in
Dec 01 09:53:15 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:53:15 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:53:15 compute-2 ceph-mon[76053]: Reconfiguring osd.2 (unknown last config time)...
Dec 01 09:53:15 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Dec 01 09:53:15 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:53:15 compute-2 ceph-mon[76053]: Reconfiguring daemon osd.2 on compute-2
Dec 01 09:53:15 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:53:15 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:53:15 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
Dec 01 09:53:15 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
Dec 01 09:53:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:15 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4003e10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:15 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:15 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:15.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:16 compute-2 sudo[89876]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:53:16 compute-2 sudo[89876]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:53:16 compute-2 sudo[89876]: pam_unix(sudo:session): session closed for user root
Dec 01 09:53:16 compute-2 ceph-mon[76053]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
Dec 01 09:53:16 compute-2 ceph-mon[76053]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
Dec 01 09:53:16 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://192.168.122.100:3000"}]: dispatch
Dec 01 09:53:16 compute-2 ceph-mon[76053]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://192.168.122.100:3000"}]: dispatch
Dec 01 09:53:16 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:53:16 compute-2 ceph-mon[76053]: pgmap v42: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 162 B/s rd, 0 op/s; 17 B/s, 1 objects/s recovering
Dec 01 09:53:16 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Dec 01 09:53:16 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e120 e120: 3 total, 3 up, 3 in
Dec 01 09:53:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:16 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a110 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000027s ======
Dec 01 09:53:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:16.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec 01 09:53:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:16 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:16 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:17 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:17 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e121 e121: 3 total, 3 up, 3 in
Dec 01 09:53:17 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Dec 01 09:53:17 compute-2 ceph-mon[76053]: osdmap e120: 3 total, 3 up, 3 in
Dec 01 09:53:17 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Dec 01 09:53:17 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:53:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:17 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:17 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:17 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:17.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:18 compute-2 ceph-mon[76053]: pgmap v44: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 09:53:18 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Dec 01 09:53:18 compute-2 ceph-mon[76053]: osdmap e121: 3 total, 3 up, 3 in
Dec 01 09:53:18 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e122 e122: 3 total, 3 up, 3 in
Dec 01 09:53:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:18 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4003e10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec 01 09:53:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:18.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec 01 09:53:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:18 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:18 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:19 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a130 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:19 compute-2 ceph-mon[76053]: osdmap e122: 3 total, 3 up, 3 in
Dec 01 09:53:19 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:53:19 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Dec 01 09:53:19 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:53:19 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:53:19 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:53:19 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:53:19 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e123 e123: 3 total, 3 up, 3 in
Dec 01 09:53:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:19 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:19 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:19 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:19.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:20 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:20.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:20 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:20 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:20 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e124 e124: 3 total, 3 up, 3 in
Dec 01 09:53:20 compute-2 ceph-mon[76053]: pgmap v47: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 385 B/s rd, 0 op/s
Dec 01 09:53:20 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Dec 01 09:53:20 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:53:20 compute-2 ceph-mon[76053]: osdmap e123: 3 total, 3 up, 3 in
Dec 01 09:53:20 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:53:20 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:53:20 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:53:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:21 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4003e10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:21 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a150 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:21 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:21 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:21 compute-2 ceph-mon[76053]: osdmap e124: 3 total, 3 up, 3 in
Dec 01 09:53:21 compute-2 ceph-mon[76053]: pgmap v50: 353 pgs: 1 active+remapped, 352 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 27 B/s, 1 objects/s recovering
Dec 01 09:53:21 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Dec 01 09:53:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000027s ======
Dec 01 09:53:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:21.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec 01 09:53:21 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e125 e125: 3 total, 3 up, 3 in
Dec 01 09:53:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:22 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a150 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:22 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e125 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:53:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:22.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:22 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:22 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:22 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e126 e126: 3 total, 3 up, 3 in
Dec 01 09:53:22 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Dec 01 09:53:22 compute-2 ceph-mon[76053]: osdmap e125: 3 total, 3 up, 3 in
Dec 01 09:53:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:23 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:23 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003eb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095323 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 09:53:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:23 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:23.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:23 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:23 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e127 e127: 3 total, 3 up, 3 in
Dec 01 09:53:23 compute-2 ceph-mon[76053]: osdmap e126: 3 total, 3 up, 3 in
Dec 01 09:53:23 compute-2 ceph-mon[76053]: pgmap v53: 353 pgs: 1 active+remapped, 352 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 27 B/s, 1 objects/s recovering
Dec 01 09:53:23 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Dec 01 09:53:23 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Dec 01 09:53:23 compute-2 ceph-mon[76053]: osdmap e127: 3 total, 3 up, 3 in
Dec 01 09:53:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:24 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818001080 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:24.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:24 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:24 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:25 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e128 e128: 3 total, 3 up, 3 in
Dec 01 09:53:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:25 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a170 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 09:53:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:25 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:25 compute-2 sudo[89952]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:53:25 compute-2 sudo[89952]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:53:25 compute-2 sudo[89952]: pam_unix(sudo:session): session closed for user root
Dec 01 09:53:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:25 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:25 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000027s ======
Dec 01 09:53:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:25.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec 01 09:53:26 compute-2 ceph-mon[76053]: osdmap e128: 3 total, 3 up, 3 in
Dec 01 09:53:26 compute-2 ceph-mon[76053]: pgmap v56: 353 pgs: 1 active+remapped, 352 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:53:26 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Dec 01 09:53:26 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:53:26 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:53:26 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e129 e129: 3 total, 3 up, 3 in
Dec 01 09:53:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:26 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003ed0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000027s ======
Dec 01 09:53:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:26.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec 01 09:53:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:26 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:26 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:27 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818001080 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:27 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Dec 01 09:53:27 compute-2 ceph-mon[76053]: osdmap e129: 3 total, 3 up, 3 in
Dec 01 09:53:27 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:53:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:27 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a190 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:27 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:27 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec 01 09:53:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:27.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec 01 09:53:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:28 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:28.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:28 compute-2 ceph-mon[76053]: pgmap v58: 353 pgs: 1 peering, 352 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 475 B/s rd, 0 op/s; 25 B/s, 0 objects/s recovering
Dec 01 09:53:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:28 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:28 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:29 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003ef0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:29 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818001080 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:29 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:29 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec 01 09:53:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:29.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec 01 09:53:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:30 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a1b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000027s ======
Dec 01 09:53:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:30.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec 01 09:53:30 compute-2 ceph-mon[76053]: pgmap v59: 353 pgs: 1 peering, 352 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s; 18 B/s, 0 objects/s recovering
Dec 01 09:53:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:30 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:30 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:31 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:31 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Dec 01 09:53:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:31 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003f10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:31 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e130 e130: 3 total, 3 up, 3 in
Dec 01 09:53:31 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 130 pg[10.1e( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=79/79 les/c/f=80/80/0 sis=130) [2] r=0 lpr=130 pi=[79,130)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:53:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:31 : epoch 692d650a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 09:53:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:31 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:31 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:31.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:32 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818001080 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:32 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:53:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000027s ======
Dec 01 09:53:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:32.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec 01 09:53:32 compute-2 ceph-mon[76053]: pgmap v60: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 982 B/s rd, 140 B/s wr, 1 op/s; 15 B/s, 0 objects/s recovering
Dec 01 09:53:32 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Dec 01 09:53:32 compute-2 ceph-mon[76053]: osdmap e130: 3 total, 3 up, 3 in
Dec 01 09:53:32 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e131 e131: 3 total, 3 up, 3 in
Dec 01 09:53:32 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 131 pg[10.1e( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=79/79 les/c/f=80/80/0 sis=131) [2]/[1] r=-1 lpr=131 pi=[79,131)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:53:32 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 131 pg[10.1e( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=79/79 les/c/f=80/80/0 sis=131) [2]/[1] r=-1 lpr=131 pi=[79,131)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 01 09:53:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:32 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:32 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:33 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a1d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:33 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:33 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e132 e132: 3 total, 3 up, 3 in
Dec 01 09:53:33 compute-2 ceph-mon[76053]: osdmap e131: 3 total, 3 up, 3 in
Dec 01 09:53:33 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 01 09:53:33 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 132 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=105/106 n=5 ec=61/50 lis/c=105/105 les/c/f=106/106/0 sis=132 pruub=14.396927834s) [1] r=-1 lpr=132 pi=[105,132)/1 crt=56'1015 mlcod 0'0 active pruub 230.594085693s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:53:33 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 132 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=105/106 n=5 ec=61/50 lis/c=105/105 les/c/f=106/106/0 sis=132 pruub=14.396872520s) [1] r=-1 lpr=132 pi=[105,132)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 230.594085693s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:53:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:33 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:33 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec 01 09:53:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:33.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec 01 09:53:33 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e133 e133: 3 total, 3 up, 3 in
Dec 01 09:53:33 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 133 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=105/106 n=5 ec=61/50 lis/c=105/105 les/c/f=106/106/0 sis=133) [1]/[2] r=0 lpr=133 pi=[105,133)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:53:33 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 133 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=105/106 n=5 ec=61/50 lis/c=105/105 les/c/f=106/106/0 sis=133) [1]/[2] r=0 lpr=133 pi=[105,133)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec 01 09:53:33 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 133 pg[10.1e( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=131/79 les/c/f=132/80/0 sis=133) [2] r=0 lpr=133 pi=[79,133)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:53:33 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 133 pg[10.1e( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=131/79 les/c/f=132/80/0 sis=133) [2] r=0 lpr=133 pi=[79,133)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 01 09:53:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:34 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003f30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec 01 09:53:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:34.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec 01 09:53:34 compute-2 ceph-mon[76053]: pgmap v63: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 713 B/s rd, 142 B/s wr, 0 op/s
Dec 01 09:53:34 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 01 09:53:34 compute-2 ceph-mon[76053]: osdmap e132: 3 total, 3 up, 3 in
Dec 01 09:53:34 compute-2 ceph-mon[76053]: osdmap e133: 3 total, 3 up, 3 in
Dec 01 09:53:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:34 : epoch 692d650a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 09:53:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:34 : epoch 692d650a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 09:53:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:34 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:34 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:34 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e134 e134: 3 total, 3 up, 3 in
Dec 01 09:53:34 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 134 pg[10.1e( v 56'1015 (0'0,56'1015] local-lis/les=133/134 n=5 ec=61/50 lis/c=131/79 les/c/f=132/80/0 sis=133) [2] r=0 lpr=133 pi=[79,133)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:53:34 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 134 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=133/134 n=5 ec=61/50 lis/c=105/105 les/c/f=106/106/0 sis=133) [1]/[2] async=[1] r=0 lpr=133 pi=[105,133)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 01 09:53:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:35 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:35 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a1f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:35 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:35 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:35.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:35 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e135 e135: 3 total, 3 up, 3 in
Dec 01 09:53:35 compute-2 ceph-mon[76053]: osdmap e134: 3 total, 3 up, 3 in
Dec 01 09:53:35 compute-2 ceph-mon[76053]: pgmap v67: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail
Dec 01 09:53:36 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 135 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=133/134 n=5 ec=61/50 lis/c=133/105 les/c/f=134/106/0 sis=135 pruub=14.986930847s) [1] async=[1] r=-1 lpr=135 pi=[105,135)/1 crt=56'1015 mlcod 56'1015 active pruub 233.425842285s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 01 09:53:36 compute-2 ceph-osd[78644]: osd.2 pg_epoch: 135 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=133/134 n=5 ec=61/50 lis/c=133/105 les/c/f=134/106/0 sis=135 pruub=14.986741066s) [1] r=-1 lpr=135 pi=[105,135)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 233.425842285s@ mbc={}] state<Start>: transitioning to Stray
Dec 01 09:53:36 compute-2 sudo[90008]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:53:36 compute-2 sudo[90008]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:53:36 compute-2 sudo[90008]: pam_unix(sudo:session): session closed for user root
Dec 01 09:53:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:36 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:36.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:36 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:36 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:37 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 e136: 3 total, 3 up, 3 in
Dec 01 09:53:37 compute-2 ceph-mon[76053]: osdmap e135: 3 total, 3 up, 3 in
Dec 01 09:53:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:37 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003f50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:37 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:53:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:37 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:37 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:37 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:37 : epoch 692d650a : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 09:53:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:37.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:38 compute-2 ceph-mon[76053]: osdmap e136: 3 total, 3 up, 3 in
Dec 01 09:53:38 compute-2 ceph-mon[76053]: pgmap v70: 353 pgs: 1 active+remapped, 352 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 5.4 KiB/s rd, 2.9 KiB/s wr, 8 op/s; 31 B/s, 2 objects/s recovering
Dec 01 09:53:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:38 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a210 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:38.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:38 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:38 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:39 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:39 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:39 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:39 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 09:53:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:39.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 09:53:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:40 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:40 compute-2 ceph-mon[76053]: pgmap v71: 353 pgs: 1 active+remapped, 352 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 3.6 KiB/s rd, 1.9 KiB/s wr, 5 op/s; 20 B/s, 1 objects/s recovering
Dec 01 09:53:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 09:53:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 09:53:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:40.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 09:53:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:40 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:40 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:41 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:41 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:41 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:41 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:41.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:42 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003f90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:42 compute-2 ceph-mon[76053]: pgmap v72: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 3.5 KiB/s rd, 1.8 KiB/s wr, 5 op/s; 17 B/s, 1 objects/s recovering
Dec 01 09:53:42 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:53:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 09:53:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:42.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 09:53:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:42 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:42 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:43 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:43 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a250 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095343 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 09:53:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:43 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:43 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 09:53:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:43.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 09:53:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:44 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:44 compute-2 ceph-mon[76053]: pgmap v73: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 1.4 KiB/s wr, 4 op/s; 13 B/s, 1 objects/s recovering
Dec 01 09:53:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:44.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:44 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:44 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:45 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003fb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:45 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:45 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:45 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:45.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:46 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a270 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:46.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:46 compute-2 ceph-mon[76053]: pgmap v74: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 331 B/s rd, 110 B/s wr, 0 op/s
Dec 01 09:53:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:46 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:46 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:47 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:47 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:53:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:47 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003fd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:47 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:47 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:47.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:48 compute-2 ceph-mon[76053]: pgmap v75: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 399 B/s rd, 99 B/s wr, 0 op/s
Dec 01 09:53:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:48 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 09:53:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:48.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 09:53:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:48 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:48 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:49 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a290 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:49 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:49 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:49 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:49.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:50 compute-2 ceph-mon[76053]: pgmap v76: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 85 B/s wr, 0 op/s
Dec 01 09:53:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:50 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003ff0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:50.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:50 compute-2 sudo[89471]: pam_unix(sudo:session): session closed for user root
Dec 01 09:53:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:50 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:50 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:51 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:51 compute-2 sudo[90199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqlxvmlxncimzmodtskxjwoitoqqauip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582831.0925033-372-46805852210210/AnsiballZ_command.py'
Dec 01 09:53:51 compute-2 sudo[90199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:53:51 compute-2 python3.9[90201]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:53:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:51 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:51 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:51 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:51.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:52 compute-2 sudo[90199]: pam_unix(sudo:session): session closed for user root
Dec 01 09:53:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:52 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:52 compute-2 ceph-mon[76053]: pgmap v77: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Dec 01 09:53:52 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:53:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 09:53:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:52.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 09:53:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:52 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:52 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:53 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0004010 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:53 compute-2 sudo[90489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yebasluaeyjrehtnnprrbjgakpalzzvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582832.6238089-396-81532135012325/AnsiballZ_selinux.py'
Dec 01 09:53:53 compute-2 sudo[90489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:53:53 compute-2 python3.9[90491]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec 01 09:53:53 compute-2 sudo[90489]: pam_unix(sudo:session): session closed for user root
Dec 01 09:53:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:53 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:53 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:53 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 09:53:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:53.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 09:53:54 compute-2 sudo[90642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdyddfdfnxtswrabczxjospinywunywx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582834.005226-429-189944087653653/AnsiballZ_command.py'
Dec 01 09:53:54 compute-2 sudo[90642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:53:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:54 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:54 compute-2 python3.9[90644]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec 01 09:53:54 compute-2 sudo[90642]: pam_unix(sudo:session): session closed for user root
Dec 01 09:53:54 compute-2 ceph-mon[76053]: pgmap v78: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Dec 01 09:53:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:54.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:54 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:54 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:54 compute-2 sudo[90795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfyjcxoypzffrksfnnjstpspjixubzmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582834.68129-452-232978946817377/AnsiballZ_file.py'
Dec 01 09:53:54 compute-2 sudo[90795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:53:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:55 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:55 compute-2 python3.9[90797]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:53:55 compute-2 sudo[90795]: pam_unix(sudo:session): session closed for user root
Dec 01 09:53:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 09:53:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:55 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0004010 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:55 compute-2 sudo[90948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffvybpgaxbpmfbsvgmyguzhfynxwikzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582835.3928595-476-36722037254097/AnsiballZ_mount.py'
Dec 01 09:53:55 compute-2 sudo[90948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:53:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:55 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:55 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 09:53:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:55.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 09:53:56 compute-2 python3.9[90950]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec 01 09:53:56 compute-2 sudo[90948]: pam_unix(sudo:session): session closed for user root
Dec 01 09:53:56 compute-2 sudo[90975]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:53:56 compute-2 sudo[90975]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:53:56 compute-2 sudo[90975]: pam_unix(sudo:session): session closed for user root
Dec 01 09:53:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:56 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:56.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:56 compute-2 ceph-mon[76053]: pgmap v79: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Dec 01 09:53:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:56 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:56 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:57 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:57 compute-2 sudo[91126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liugtsfutollabrfmhsrrwvuduczeavj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582836.9741104-561-251474792203698/AnsiballZ_file.py'
Dec 01 09:53:57 compute-2 sudo[91126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:53:57 compute-2 python3.9[91128]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:53:57 compute-2 sudo[91126]: pam_unix(sudo:session): session closed for user root
Dec 01 09:53:57 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:53:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:57 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:57 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:57 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:57.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:53:58 compute-2 sudo[91279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulnjcokavgjbbzrphhpyvsjbecyeckfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582837.7715502-585-77584870008378/AnsiballZ_stat.py'
Dec 01 09:53:58 compute-2 sudo[91279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:53:58 compute-2 python3.9[91281]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:53:58 compute-2 sudo[91279]: pam_unix(sudo:session): session closed for user root
Dec 01 09:53:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:58 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:58 compute-2 sudo[91357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuywodszpomauwxyflojhjsaqrzkpbla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582837.7715502-585-77584870008378/AnsiballZ_file.py'
Dec 01 09:53:58 compute-2 sudo[91357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:53:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000021s ======
Dec 01 09:53:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:58.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000021s
Dec 01 09:53:58 compute-2 python3.9[91359]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:53:58 compute-2 sudo[91357]: pam_unix(sudo:session): session closed for user root
Dec 01 09:53:58 compute-2 ceph-mon[76053]: pgmap v80: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Dec 01 09:53:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:58 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:58 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:59 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0004010 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:59 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4002a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:53:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:59 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:59 compute-2 sudo[91511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkyygmlnvlscnchegxtwjvewcwzlgtmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582839.5991008-648-40834604532060/AnsiballZ_stat.py'
Dec 01 09:53:59 compute-2 sudo[91511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:53:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:59 2025: (VI_0) received an invalid passwd!
Dec 01 09:53:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:53:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:53:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:59.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:00 compute-2 python3.9[91513]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:54:00 compute-2 sudo[91511]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:00 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:00.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:00 compute-2 ceph-mon[76053]: pgmap v81: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:54:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:00 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:00 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:01 compute-2 sudo[91666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggzniicxtjkupfjunwydcquoqzdmgwqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582840.6367054-687-216024223722804/AnsiballZ_getent.py'
Dec 01 09:54:01 compute-2 sudo[91666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:54:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:01 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:01 compute-2 python3.9[91668]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec 01 09:54:01 compute-2 sudo[91666]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:01 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0004010 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:01 compute-2 sudo[91820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-poxkgeklpijmrtnclwpvvrxqicgtpvwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582841.6251435-717-26268613095120/AnsiballZ_getent.py'
Dec 01 09:54:01 compute-2 sudo[91820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:54:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:01 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:01 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:01.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:02 compute-2 python3.9[91822]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec 01 09:54:02 compute-2 sudo[91820]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:02 compute-2 ceph-mon[76053]: pgmap v82: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 01 09:54:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:02 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0004010 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:02 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:54:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:02.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:02 compute-2 sudo[91974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrdrumxylxqzyimfvmmmmslcncpqujbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582842.4309077-741-56231946353660/AnsiballZ_group.py'
Dec 01 09:54:02 compute-2 sudo[91974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:54:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:02 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:02 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:03 compute-2 python3.9[91976]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 01 09:54:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:03 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:03 compute-2 sudo[91974]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:03 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:03 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:03 compute-2 sudo[92127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuimxesxxxwriiekgpiaudqovxkjaaqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582843.6251187-767-267882533679618/AnsiballZ_file.py'
Dec 01 09:54:03 compute-2 sudo[92127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:54:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:03 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:03.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:04 compute-2 python3.9[92129]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec 01 09:54:04 compute-2 sudo[92127]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:04 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 09:54:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:04.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 09:54:04 compute-2 sudo[92280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yndawscyykepqneojjitaeljbjrykhyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582844.6367197-801-20862033806175/AnsiballZ_dnf.py'
Dec 01 09:54:04 compute-2 sudo[92280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:54:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:04 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:04 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:05 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4002a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:05 compute-2 python3.9[92282]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:54:05 compute-2 ceph-mon[76053]: pgmap v83: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:54:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:05 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:05 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:05 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 09:54:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:05.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 09:54:06 compute-2 ceph-mon[76053]: pgmap v84: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:54:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:06.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:06 compute-2 sudo[92280]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:06 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:06 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:07 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:07 compute-2 sudo[92435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyvydoukgcuazhqjlpgkuquarwiqvdfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582846.8781576-825-133398176470426/AnsiballZ_file.py'
Dec 01 09:54:07 compute-2 sudo[92435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:54:07 compute-2 python3.9[92437]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:54:07 compute-2 sudo[92435]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:54:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:07 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4002a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:07 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:07 compute-2 sudo[92588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqbulaiesuqdhsoicrkeeoikajhkasyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582847.610074-849-92873386754488/AnsiballZ_stat.py'
Dec 01 09:54:07 compute-2 sudo[92588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:54:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:07 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:07.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:08 compute-2 python3.9[92590]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:54:08 compute-2 sudo[92588]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:08 compute-2 sudo[92666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efruhgobxoqgbxtezisozuyppgmpwizs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582847.610074-849-92873386754488/AnsiballZ_file.py'
Dec 01 09:54:08 compute-2 sudo[92666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:54:08 compute-2 ceph-mon[76053]: pgmap v85: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 09:54:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:08 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:08 compute-2 python3.9[92668]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:54:08 compute-2 sudo[92666]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 09:54:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:08.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 09:54:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:08 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:08 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:09 compute-2 sudo[92819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhvkkzrjasehfqkyaqgnvrjfvlkjqpbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582848.7870696-888-270214030302552/AnsiballZ_stat.py'
Dec 01 09:54:09 compute-2 sudo[92819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:54:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:09 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:09 compute-2 python3.9[92821]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:54:09 compute-2 sudo[92819]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:09 compute-2 sudo[92898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwxsylbdqngkvxwfftxdfufgkfrheblb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582848.7870696-888-270214030302552/AnsiballZ_file.py'
Dec 01 09:54:09 compute-2 sudo[92898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:54:09 compute-2 python3.9[92900]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:54:09 compute-2 sudo[92898]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:09 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:09 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:09 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:09.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:10 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:10 compute-2 ceph-mon[76053]: pgmap v86: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:54:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 09:54:10 compute-2 sudo[93052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnywqbqvlrqkksujkriuvkjtbltureho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582850.2730415-933-245823890293316/AnsiballZ_dnf.py'
Dec 01 09:54:10 compute-2 sudo[93052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:54:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:10.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:10 compute-2 python3.9[93054]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:54:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:10 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:10 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:11 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:11 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:11 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:11 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:11.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:12 compute-2 sudo[93052]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:12 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:54:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:12.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:12 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:12 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:13 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:13 compute-2 python3.9[93208]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:54:13 compute-2 ceph-mon[76053]: pgmap v87: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 255 B/s wr, 0 op/s
Dec 01 09:54:13 compute-2 ceph-mon[76053]: mgrmap e33: compute-0.fospow(active, since 93s), standbys: compute-1.ymizfm, compute-2.kdtkls
Dec 01 09:54:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:13 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:13 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:13 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:13.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:14 compute-2 python3.9[93361]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec 01 09:54:14 compute-2 ceph-mon[76053]: pgmap v88: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 255 B/s wr, 0 op/s
Dec 01 09:54:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:14 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:14.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:14 compute-2 python3.9[93511]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:54:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:14 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:14 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:15 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4002a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:15 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:15 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:15 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:15.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:16 compute-2 sudo[93663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqqmhwqabavdubrozxdygafhotvzmrtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582855.471762-1056-51435354942300/AnsiballZ_systemd.py'
Dec 01 09:54:16 compute-2 sudo[93663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:54:16 compute-2 python3.9[93665]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:54:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:16 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c002090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:16 compute-2 systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec 01 09:54:16 compute-2 sudo[93667]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:54:16 compute-2 sudo[93667]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:54:16 compute-2 sudo[93667]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:16 compute-2 systemd[1]: tuned.service: Deactivated successfully.
Dec 01 09:54:16 compute-2 systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec 01 09:54:16 compute-2 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 01 09:54:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:16.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:16 compute-2 systemd[1]: Started Dynamic System Tuning Daemon.
Dec 01 09:54:16 compute-2 sudo[93663]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:16 compute-2 ceph-mon[76053]: pgmap v89: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 255 B/s wr, 0 op/s
Dec 01 09:54:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:16 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:16 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:17 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:17 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:54:17 compute-2 python3.9[93854]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec 01 09:54:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:17 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4002a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:17 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:17 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:17.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:18 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000021s ======
Dec 01 09:54:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:18.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000021s
Dec 01 09:54:18 compute-2 ceph-mon[76053]: pgmap v90: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 255 B/s wr, 0 op/s
Dec 01 09:54:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:18 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:18 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:19 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c002090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:19 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:19 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:19 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:19.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:20 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:20.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:20 compute-2 ceph-mon[76053]: pgmap v91: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 255 B/s wr, 0 op/s
Dec 01 09:54:20 compute-2 sudo[94006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpbazkppaoznhlqnutumeisoidccmqzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582860.5209072-1227-149376293659467/AnsiballZ_systemd.py'
Dec 01 09:54:20 compute-2 sudo[94006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:54:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:20 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:20 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:21 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:21 compute-2 python3.9[94009]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:54:21 compute-2 sudo[94006]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:21 compute-2 sudo[94162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjcmsntrptrjohowblvmnbagxtctsisn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582861.3143544-1227-115207700614311/AnsiballZ_systemd.py'
Dec 01 09:54:21 compute-2 sudo[94162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:54:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:21 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c002da0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:21 compute-2 python3.9[94164]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:54:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:21 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:21 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:21 compute-2 sudo[94162]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:21 compute-2 ceph-mon[76053]: pgmap v92: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 255 B/s wr, 0 op/s
Dec 01 09:54:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:21.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:22 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:22 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:54:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 09:54:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:22.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 09:54:22 compute-2 sshd-session[86397]: Connection closed by 192.168.122.30 port 45252
Dec 01 09:54:22 compute-2 sshd-session[86394]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:54:22 compute-2 systemd[1]: session-38.scope: Deactivated successfully.
Dec 01 09:54:22 compute-2 systemd[1]: session-38.scope: Consumed 1min 11.538s CPU time.
Dec 01 09:54:22 compute-2 systemd-logind[795]: Session 38 logged out. Waiting for processes to exit.
Dec 01 09:54:22 compute-2 systemd-logind[795]: Removed session 38.
Dec 01 09:54:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:22 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:22 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:23 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4002a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:23 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:23 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:23 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 09:54:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:23.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 09:54:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:24 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c002da0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:24 compute-2 ceph-mon[76053]: pgmap v93: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:54:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:24.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:24 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:24 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:25 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c002da0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 09:54:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:25 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4002a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:25 compute-2 sudo[94195]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:54:25 compute-2 sudo[94195]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:54:25 compute-2 sudo[94195]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:25 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:25 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:25 compute-2 sudo[94220]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 09:54:25 compute-2 sudo[94220]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:54:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:25.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:26 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:26 compute-2 sudo[94220]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:26 compute-2 ceph-mon[76053]: pgmap v94: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:54:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:26.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:26 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:26 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:27 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c002da0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:27 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:54:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:27 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:27 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:27 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 09:54:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:28.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 09:54:28 compute-2 sshd-session[94278]: Accepted publickey for zuul from 192.168.122.30 port 56192 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 09:54:28 compute-2 systemd-logind[795]: New session 39 of user zuul.
Dec 01 09:54:28 compute-2 systemd[1]: Started Session 39 of User zuul.
Dec 01 09:54:28 compute-2 sshd-session[94278]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:54:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:28 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f40045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:28 compute-2 ceph-mon[76053]: pgmap v95: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 09:54:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:28.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:28 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:28 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:29 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:29 compute-2 python3.9[94432]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:54:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:29 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c003ea0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:29 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:29 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:30.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:30 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:54:30 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:54:30 compute-2 ceph-mon[76053]: pgmap v96: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:54:30 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:54:30 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:54:30 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:54:30 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:54:30 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:54:30 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:54:30 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:54:30 compute-2 sudo[94587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ziqiumugnwmhfjhgoikikdkwldrdefrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582869.9950411-71-6061292689249/AnsiballZ_getent.py'
Dec 01 09:54:30 compute-2 sudo[94587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:54:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:30 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:30 compute-2 python3.9[94589]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec 01 09:54:30 compute-2 sudo[94587]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000021s ======
Dec 01 09:54:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:30.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000021s
Dec 01 09:54:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:30 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:30 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:31 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f40045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:31 compute-2 sudo[94742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmqevtuvgjjwsapytovaioflbfuwtehd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582871.048125-106-125990486856425/AnsiballZ_setup.py'
Dec 01 09:54:31 compute-2 sudo[94742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:54:31 compute-2 python3.9[94744]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:54:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:31 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:31 compute-2 sudo[94742]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:31 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:31 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:32.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:32 compute-2 sudo[94826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hevybnysvsicmdyijzyojrkeltneoxnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582871.048125-106-125990486856425/AnsiballZ_dnf.py'
Dec 01 09:54:32 compute-2 sudo[94826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:54:32 compute-2 ceph-mon[76053]: pgmap v97: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 01 09:54:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:32 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c003ea0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:32 compute-2 python3.9[94828]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 01 09:54:32 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:54:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 09:54:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:32.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 09:54:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:32 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:32 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:33 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:33 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f40045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:33 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:33 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:34 compute-2 sudo[94826]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:34.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:34 compute-2 ceph-mon[76053]: pgmap v98: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:54:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:34 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:34 compute-2 sudo[94981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnicugcnholxbdnwzaukpfuhcwejloyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582874.3242972-148-184562481096411/AnsiballZ_dnf.py'
Dec 01 09:54:34 compute-2 sudo[94981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:54:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 09:54:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:34.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 09:54:34 compute-2 python3.9[94983]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:54:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:34 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:34 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:35 compute-2 sudo[94986]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:54:35 compute-2 sudo[94986]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:54:35 compute-2 sudo[94986]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:35 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c003ea0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:35 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:35 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:54:35 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:54:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:35 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:35 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 09:54:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:36.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 09:54:36 compute-2 sudo[94981]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:36 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f40045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:36 compute-2 sudo[95057]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:54:36 compute-2 sudo[95057]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:54:36 compute-2 sudo[95057]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:36.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:36 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:36 compute-2 ceph-mon[76053]: pgmap v99: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:54:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:36 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:37 compute-2 sudo[95187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tezegjkpcwelkanxcsiyriulmlxgfcmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582876.5049448-172-257387539973903/AnsiballZ_systemd.py'
Dec 01 09:54:37 compute-2 sudo[95187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:54:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:37 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:37 compute-2 python3.9[95189]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 01 09:54:37 compute-2 sudo[95187]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:37 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:54:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:37 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c003ea0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:37 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:37 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:37 compute-2 ceph-mon[76053]: pgmap v100: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 09:54:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:38.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:38 compute-2 python3.9[95343]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:54:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:38 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:38.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:38 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:38 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:39 compute-2 sudo[95494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npoaqhsvqbheycvracpbvhoujhskadgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582878.63161-227-97618458817966/AnsiballZ_sefcontext.py'
Dec 01 09:54:39 compute-2 sudo[95494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:54:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:39 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f40045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.355864) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582879356032, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 2561, "num_deletes": 252, "total_data_size": 9924562, "memory_usage": 10223456, "flush_reason": "Manual Compaction"}
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Dec 01 09:54:39 compute-2 python3.9[95496]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582879386275, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 6164538, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 8156, "largest_seqno": 10712, "table_properties": {"data_size": 6153332, "index_size": 7252, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2885, "raw_key_size": 24268, "raw_average_key_size": 21, "raw_value_size": 6130188, "raw_average_value_size": 5330, "num_data_blocks": 321, "num_entries": 1150, "num_filter_entries": 1150, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582754, "oldest_key_time": 1764582754, "file_creation_time": 1764582879, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 30440 microseconds, and 13601 cpu microseconds.
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.386331) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 6164538 bytes OK
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.386357) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.388259) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.388289) EVENT_LOG_v1 {"time_micros": 1764582879388283, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.388314) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 9912539, prev total WAL file size 9912539, number of live WAL files 2.
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.390539) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(6020KB)], [18(12MB)]
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582879390682, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 19199656, "oldest_snapshot_seqno": -1}
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 4130 keys, 14810082 bytes, temperature: kUnknown
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582879500620, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 14810082, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14776577, "index_size": 22067, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10373, "raw_key_size": 105287, "raw_average_key_size": 25, "raw_value_size": 14695095, "raw_average_value_size": 3558, "num_data_blocks": 945, "num_entries": 4130, "num_filter_entries": 4130, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764582879, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.500926) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 14810082 bytes
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.502284) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 174.5 rd, 134.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(5.9, 12.4 +0.0 blob) out(14.1 +0.0 blob), read-write-amplify(5.5) write-amplify(2.4) OK, records in: 4666, records dropped: 536 output_compression: NoCompression
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.502306) EVENT_LOG_v1 {"time_micros": 1764582879502294, "job": 8, "event": "compaction_finished", "compaction_time_micros": 110031, "compaction_time_cpu_micros": 36020, "output_level": 6, "num_output_files": 1, "total_output_size": 14810082, "num_input_records": 4666, "num_output_records": 4130, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582879503297, "job": 8, "event": "table_file_deletion", "file_number": 20}
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582879505327, "job": 8, "event": "table_file_deletion", "file_number": 18}
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.390204) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.505479) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.505489) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.505491) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.505493) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:54:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.505495) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:54:39 compute-2 sudo[95494]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:39 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:39 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:39 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:54:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:40.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:54:40 compute-2 ceph-mon[76053]: pgmap v101: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:54:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 09:54:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:40 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c003ea0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:40 compute-2 python3.9[95647]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:54:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:40.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:40 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:40 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:41 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:41 compute-2 sudo[95807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbhchdlicwnjxxynhbcvvuzavxesixnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582881.0150304-280-240162070028427/AnsiballZ_dnf.py'
Dec 01 09:54:41 compute-2 sudo[95807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:54:41 compute-2 python3.9[95809]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:54:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:41 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f40045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:41 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:41 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:42.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:42 compute-2 ceph-mon[76053]: pgmap v102: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 01 09:54:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:42 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f40045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:42 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:54:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:54:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:42.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:54:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:42 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:42 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:43 compute-2 sudo[95807]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:43 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97fc000f30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:43 compute-2 sudo[95963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drsrxmyqkgvtohzpfldfozfnpixdlytr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582883.192356-304-47026773540513/AnsiballZ_command.py'
Dec 01 09:54:43 compute-2 sudo[95963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:54:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:43 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:43 compute-2 python3.9[95965]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:54:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:43 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:43 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:44.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:44 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f98240089d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:44 compute-2 ceph-mon[76053]: pgmap v103: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:54:44 compute-2 sudo[95963]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:54:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:44.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:54:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:44 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:44 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:45 compute-2 sshd-session[95811]: Invalid user ir from 14.22.89.30 port 41636
Dec 01 09:54:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:45 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f40045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:45 compute-2 sshd-session[95811]: Received disconnect from 14.22.89.30 port 41636:11: Bye Bye [preauth]
Dec 01 09:54:45 compute-2 sshd-session[95811]: Disconnected from invalid user ir 14.22.89.30 port 41636 [preauth]
Dec 01 09:54:45 compute-2 sudo[96253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irhpqwqozabvnuhwdgymnplbwsasfigb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582884.9066377-329-221755702400678/AnsiballZ_file.py'
Dec 01 09:54:45 compute-2 sudo[96253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:54:45 compute-2 python3.9[96255]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 01 09:54:45 compute-2 sudo[96253]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:45 compute-2 sshd-session[71491]: Received disconnect from 38.102.83.143 port 55982:11: disconnected by user
Dec 01 09:54:45 compute-2 sshd-session[71491]: Disconnected from user zuul 38.102.83.143 port 55982
Dec 01 09:54:45 compute-2 sshd-session[71488]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:54:45 compute-2 systemd[1]: session-19.scope: Deactivated successfully.
Dec 01 09:54:45 compute-2 systemd[1]: session-19.scope: Consumed 8.775s CPU time.
Dec 01 09:54:45 compute-2 systemd-logind[795]: Session 19 logged out. Waiting for processes to exit.
Dec 01 09:54:45 compute-2 systemd-logind[795]: Removed session 19.
Dec 01 09:54:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:45 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f40045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:54:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:45 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:45 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:46.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:46 compute-2 python3.9[96405]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:54:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:46 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97fc000f30 fd 48 proxy ignored for local
Dec 01 09:54:46 compute-2 kernel: ganesha.nfsd[95754]: segfault at 50 ip 00007f98d405932e sp 00007f98877fd210 error 4 in libntirpc.so.5.8[7f98d403e000+2c000] likely on CPU 6 (core 0, socket 6)
Dec 01 09:54:46 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec 01 09:54:46 compute-2 ceph-mon[76053]: pgmap v104: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:54:46 compute-2 systemd[1]: Created slice Slice /system/systemd-coredump.
Dec 01 09:54:46 compute-2 systemd[1]: Started Process Core Dump (PID 96408/UID 0).
Dec 01 09:54:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:46.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:46 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:46 compute-2 sudo[96560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acgzaraddkffnuhktrlorzllotsoxoap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582886.6308234-376-204204225298558/AnsiballZ_dnf.py'
Dec 01 09:54:46 compute-2 sudo[96560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:54:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:46 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:47 compute-2 python3.9[96562]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:54:47 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:54:47 compute-2 systemd-coredump[96409]: Process 84060 (ganesha.nfsd) of user 0 dumped core.
                                                   
                                                   Stack trace of thread 70:
                                                   #0  0x00007f98d405932e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                   ELF object binary architecture: AMD x86-64
Dec 01 09:54:47 compute-2 systemd[1]: systemd-coredump@0-96408-0.service: Deactivated successfully.
Dec 01 09:54:47 compute-2 systemd[1]: systemd-coredump@0-96408-0.service: Consumed 1.348s CPU time.
Dec 01 09:54:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:47 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:47 compute-2 podman[96569]: 2025-12-01 09:54:47.908937523 +0000 UTC m=+0.025773342 container died 7dc3478aba756353fd4c95c60545e4383e561bd972d5b694631e57cc304acfc8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_REF=squid)
Dec 01 09:54:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:47 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:47 compute-2 systemd[1]: var-lib-containers-storage-overlay-9262ddf7fc944390bf835eb5e50a7d780785627595a79a8a8c408446a409e3eb-merged.mount: Deactivated successfully.
Dec 01 09:54:47 compute-2 podman[96569]: 2025-12-01 09:54:47.952144208 +0000 UTC m=+0.068979997 container remove 7dc3478aba756353fd4c95c60545e4383e561bd972d5b694631e57cc304acfc8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=squid)
Dec 01 09:54:47 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec 01 09:54:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:48.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:48 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec 01 09:54:48 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 2.256s CPU time.
Dec 01 09:54:48 compute-2 ceph-mon[76053]: pgmap v105: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 09:54:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:48.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:48 compute-2 sudo[96560]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:48 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:48 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:49 compute-2 sudo[96762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmuseoppdsdeamxhhwofdjiphaatgkgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582888.9707198-403-101554637060097/AnsiballZ_dnf.py'
Dec 01 09:54:49 compute-2 sudo[96762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:54:49 compute-2 python3.9[96764]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:54:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:49 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:49 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:50.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:50 compute-2 ceph-mon[76053]: pgmap v106: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:54:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:50.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:50 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:50 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:51 compute-2 sudo[96762]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:51 compute-2 sudo[96918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhakghdyisdrxyhsliehneksoshgyayf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582891.4756763-440-164134837243894/AnsiballZ_stat.py'
Dec 01 09:54:51 compute-2 sudo[96918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:54:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:51 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:51 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:51 compute-2 python3.9[96920]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:54:51 compute-2 sudo[96918]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:52.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095452 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 09:54:52 compute-2 ceph-mon[76053]: pgmap v107: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 09:54:52 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:54:52 compute-2 sudo[97072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gofmhdalbbquiprclldhogbsgvgxnudv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582892.1824355-464-157473398537359/AnsiballZ_slurp.py'
Dec 01 09:54:52 compute-2 sudo[97072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:54:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:52.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:52 compute-2 python3.9[97074]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Dec 01 09:54:52 compute-2 sudo[97072]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:52 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:52 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:53 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:53 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:54 compute-2 sshd-session[94281]: Connection closed by 192.168.122.30 port 56192
Dec 01 09:54:54 compute-2 sshd-session[94278]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:54:54 compute-2 systemd[1]: session-39.scope: Deactivated successfully.
Dec 01 09:54:54 compute-2 systemd[1]: session-39.scope: Consumed 18.923s CPU time.
Dec 01 09:54:54 compute-2 systemd-logind[795]: Session 39 logged out. Waiting for processes to exit.
Dec 01 09:54:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:54 compute-2 systemd-logind[795]: Removed session 39.
Dec 01 09:54:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:54:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:54.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:54:54 compute-2 ceph-mon[76053]: pgmap v108: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 09:54:54 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 09:54:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:54.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:54 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:54 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:55 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:55 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:56.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:56 compute-2 ceph-mon[76053]: pgmap v109: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 09:54:56 compute-2 sudo[97103]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:54:56 compute-2 sudo[97103]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:54:56 compute-2 sudo[97103]: pam_unix(sudo:session): session closed for user root
Dec 01 09:54:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:54:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:56.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:54:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:56 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:56 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:57 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:54:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:57 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:57 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:54:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:58.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:54:58 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 1.
Dec 01 09:54:58 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 09:54:58 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 2.256s CPU time.
Dec 01 09:54:58 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec 01 09:54:58 compute-2 podman[97176]: 2025-12-01 09:54:58.545315711 +0000 UTC m=+0.043342829 container create 24675a7fbd7929e0e80ba6861d5978cf76b37238256938cdece6f3d5a070a5b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:54:58 compute-2 ceph-mon[76053]: pgmap v110: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:54:58 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7831c2f4beb234a864e2285769eb81e44a1bd0b93251371511203f179768ad4d/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 01 09:54:58 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7831c2f4beb234a864e2285769eb81e44a1bd0b93251371511203f179768ad4d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:54:58 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7831c2f4beb234a864e2285769eb81e44a1bd0b93251371511203f179768ad4d/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:54:58 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7831c2f4beb234a864e2285769eb81e44a1bd0b93251371511203f179768ad4d/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:54:58 compute-2 podman[97176]: 2025-12-01 09:54:58.600614655 +0000 UTC m=+0.098641803 container init 24675a7fbd7929e0e80ba6861d5978cf76b37238256938cdece6f3d5a070a5b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True)
Dec 01 09:54:58 compute-2 podman[97176]: 2025-12-01 09:54:58.60605246 +0000 UTC m=+0.104079578 container start 24675a7fbd7929e0e80ba6861d5978cf76b37238256938cdece6f3d5a070a5b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:54:58 compute-2 bash[97176]: 24675a7fbd7929e0e80ba6861d5978cf76b37238256938cdece6f3d5a070a5b1
Dec 01 09:54:58 compute-2 podman[97176]: 2025-12-01 09:54:58.529875207 +0000 UTC m=+0.027902365 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:54:58 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 09:54:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:54:58 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 01 09:54:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:54:58 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 01 09:54:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:54:58 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 01 09:54:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:54:58 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 01 09:54:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:54:58 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 01 09:54:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:54:58 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 01 09:54:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:54:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:54:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:58.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:54:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:54:58 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 01 09:54:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:54:58 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 09:54:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:58 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:58 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:59 compute-2 sshd-session[97234]: Accepted publickey for zuul from 192.168.122.30 port 45828 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 09:54:59 compute-2 systemd-logind[795]: New session 40 of user zuul.
Dec 01 09:54:59 compute-2 systemd[1]: Started Session 40 of User zuul.
Dec 01 09:54:59 compute-2 sshd-session[97234]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:54:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:59 2025: (VI_0) received an invalid passwd!
Dec 01 09:54:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:59 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:00.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:00 compute-2 python3.9[97387]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:55:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:55:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:00.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:55:00 compute-2 ceph-mon[76053]: pgmap v111: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 09:55:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:00 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:00 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:01 compute-2 python3.9[97543]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:55:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:01 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:01 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:02.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:02 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:55:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:02.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:02 compute-2 ceph-mon[76053]: pgmap v112: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 85 B/s wr, 0 op/s
Dec 01 09:55:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:02 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:02 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:03 compute-2 python3.9[97737]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:55:03 compute-2 sshd-session[97237]: Connection closed by 192.168.122.30 port 45828
Dec 01 09:55:03 compute-2 sshd-session[97234]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:55:03 compute-2 systemd[1]: session-40.scope: Deactivated successfully.
Dec 01 09:55:03 compute-2 systemd[1]: session-40.scope: Consumed 2.277s CPU time.
Dec 01 09:55:03 compute-2 systemd-logind[795]: Session 40 logged out. Waiting for processes to exit.
Dec 01 09:55:03 compute-2 systemd-logind[795]: Removed session 40.
Dec 01 09:55:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:03 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:03 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:04.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:55:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:04.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:55:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:04 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 09:55:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:04 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 09:55:04 compute-2 ceph-mon[76053]: pgmap v113: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Dec 01 09:55:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:04 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:04 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:05 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:05 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:06.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:55:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:06.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:55:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:06 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:06 compute-2 ceph-mon[76053]: pgmap v114: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Dec 01 09:55:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:06 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:55:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:07 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:07 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:08.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:08 compute-2 ceph-mon[76053]: pgmap v115: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Dec 01 09:55:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:08.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:08 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:08 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:09 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:09 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:10.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:10 compute-2 ceph-mon[76053]: pgmap v116: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 938 B/s wr, 3 op/s
Dec 01 09:55:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 09:55:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:55:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:10.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:55:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 09:55:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec 01 09:55:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec 01 09:55:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec 01 09:55:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec 01 09:55:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec 01 09:55:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec 01 09:55:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 09:55:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 09:55:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 09:55:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec 01 09:55:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 09:55:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec 01 09:55:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec 01 09:55:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec 01 09:55:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec 01 09:55:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec 01 09:55:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec 01 09:55:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec 01 09:55:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec 01 09:55:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec 01 09:55:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec 01 09:55:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec 01 09:55:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec 01 09:55:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 09:55:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec 01 09:55:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 09:55:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:10 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:10 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:11 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d68000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:11 compute-2 sshd-session[97788]: Accepted publickey for zuul from 192.168.122.30 port 42826 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 09:55:11 compute-2 systemd-logind[795]: New session 41 of user zuul.
Dec 01 09:55:11 compute-2 systemd[1]: Started Session 41 of User zuul.
Dec 01 09:55:11 compute-2 sshd-session[97788]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:55:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:11 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d54001950 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:11 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:11 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:55:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:12.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:55:12 compute-2 python3.9[97941]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:55:12 compute-2 ceph-mon[76053]: pgmap v117: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 09:55:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:12 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d44000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:12 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:55:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:55:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:12.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:55:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:12 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:12 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:13 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d3c000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:13 compute-2 python3.9[98096]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:55:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:13 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d60001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:13 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:13 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:14.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:14 compute-2 sudo[98251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmlwmiemkyeozsuhrzotejuyupqxreji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582913.9000094-82-4380868947479/AnsiballZ_setup.py'
Dec 01 09:55:14 compute-2 sudo[98251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:14 compute-2 ceph-mon[76053]: pgmap v118: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Dec 01 09:55:14 compute-2 python3.9[98253]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:55:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095514 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 09:55:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:14 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d54001950 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:14.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:14 compute-2 sudo[98251]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:14 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:14 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:15 compute-2 sudo[98336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inqidhonknilxmbldwbxpbsnkstsqgco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582913.9000094-82-4380868947479/AnsiballZ_dnf.py'
Dec 01 09:55:15 compute-2 sudo[98336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:15 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d440016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:15 compute-2 python3.9[98338]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:55:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:15 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d3c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:15 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:15 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:55:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:16.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:55:16 compute-2 ceph-mon[76053]: pgmap v119: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Dec 01 09:55:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:16 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d600023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:55:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:16.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:55:16 compute-2 sudo[98341]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:55:16 compute-2 sudo[98341]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:55:16 compute-2 sudo[98341]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:16 compute-2 sudo[98336]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:16 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:16 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:17 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d54001950 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:17 compute-2 sudo[98517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukgjawgksfvhdciigfaqficbbkiioxyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582916.9911995-118-100296798285461/AnsiballZ_setup.py'
Dec 01 09:55:17 compute-2 sudo[98517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:17 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:55:17 compute-2 python3.9[98519]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:55:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:17 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d440016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:17 compute-2 sudo[98517]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:17 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:17 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:55:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:18.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:55:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:18 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d3c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:18 compute-2 ceph-mon[76053]: pgmap v120: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Dec 01 09:55:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:55:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:18.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:55:18 compute-2 sudo[98712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exavqesrggxwqtcxeejslmcjhufnpxue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582918.3568249-151-81382928042301/AnsiballZ_file.py'
Dec 01 09:55:18 compute-2 sudo[98712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:18 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:18 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:18 compute-2 python3.9[98714]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:55:18 compute-2 sudo[98712]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:19 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d600023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:19 compute-2 sudo[98866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oapfvmcriwbkfueoiupsvnphdnmeoawn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582919.3027103-175-84033816354949/AnsiballZ_command.py'
Dec 01 09:55:19 compute-2 sudo[98866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:19 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d54001950 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:19 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:19 compute-2 python3.9[98868]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:55:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:19 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:19 compute-2 sudo[98866]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:20.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:20 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d440016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:20.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:20 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:20 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:21 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d440016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:21 compute-2 ceph-mon[76053]: pgmap v121: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Dec 01 09:55:21 compute-2 sudo[99033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihwaztyicdojmqltaglevbrhmjwugqqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582920.3561282-199-203000175648520/AnsiballZ_stat.py'
Dec 01 09:55:21 compute-2 sudo[99033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:21 compute-2 python3.9[99035]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:55:21 compute-2 sudo[99033]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:21 compute-2 sudo[99111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqoexqayhfoowlvhvanijdojndtyumfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582920.3561282-199-203000175648520/AnsiballZ_file.py'
Dec 01 09:55:21 compute-2 sudo[99111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:21 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d600023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:21 compute-2 python3.9[99113]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:55:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:21 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:21 compute-2 sudo[99111]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:21 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:22.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:22 compute-2 ceph-mon[76053]: pgmap v122: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Dec 01 09:55:22 compute-2 sudo[99263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmwqhduleokvwrxpzeeqehzvkbzjryrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582922.0747356-235-272538580416690/AnsiballZ_stat.py'
Dec 01 09:55:22 compute-2 sudo[99263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:22 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d54001950 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:22 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:55:22 compute-2 python3.9[99265]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:55:22 compute-2 sudo[99263]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:55:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:22.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:55:22 compute-2 sudo[99341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckcrqzorbyysyhefxawnaslrfdfkuefd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582922.0747356-235-272538580416690/AnsiballZ_file.py'
Dec 01 09:55:22 compute-2 sudo[99341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:22 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:22 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:22 compute-2 python3.9[99343]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:55:22 compute-2 sudo[99341]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:23 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d440016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:23 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d440016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:23 compute-2 sudo[99495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsiuyrhlpkqpbdzgsckuwqittdnsbdqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582923.3525932-274-223053367720778/AnsiballZ_ini_file.py'
Dec 01 09:55:23 compute-2 sudo[99495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:23 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:23 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:23 compute-2 python3.9[99497]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:55:24 compute-2 sudo[99495]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:24.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:24 compute-2 ceph-mon[76053]: pgmap v123: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Dec 01 09:55:24 compute-2 sudo[99648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fswawdpmsczvjwntwhexorpinpnnjqox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582924.1410637-274-107132372588742/AnsiballZ_ini_file.py'
Dec 01 09:55:24 compute-2 sudo[99648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:24 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d600034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:24 compute-2 python3.9[99650]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:55:24 compute-2 sudo[99648]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:24.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:24 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:24 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:24 compute-2 sudo[99801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdosefbfgrgzspvgqttgnoaqkuckjssz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582924.7247539-274-91010721871370/AnsiballZ_ini_file.py'
Dec 01 09:55:24 compute-2 sudo[99801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:25 compute-2 python3.9[99803]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:55:25 compute-2 sudo[99801]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:25 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d54001950 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 09:55:25 compute-2 sudo[99954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxqvytzpzxptkvvdaepzfjvymfdgbakp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582925.28521-274-15873021196172/AnsiballZ_ini_file.py'
Dec 01 09:55:25 compute-2 sudo[99954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:25 compute-2 python3.9[99956]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:55:25 compute-2 sudo[99954]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:25 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d440016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:25 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:25 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:26.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:26 compute-2 ceph-mon[76053]: pgmap v124: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Dec 01 09:55:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:26 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d440016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:26 compute-2 sudo[100106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncqaroxgwaonedxijawlffepzgqplwzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582926.434314-367-186452892438216/AnsiballZ_dnf.py'
Dec 01 09:55:26 compute-2 sudo[100106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:55:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:26.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:55:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:26 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:26 compute-2 python3.9[100108]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:55:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:26 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:27 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d600034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:27 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:55:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:27 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d54001950 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:27 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:27 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:28.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:28 compute-2 ceph-mon[76053]: pgmap v125: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Dec 01 09:55:28 compute-2 sudo[100106]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:28 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d54001950 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:55:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:28.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:55:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:28 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:28 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:29 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d3c002ba0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:29 compute-2 sudo[100262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdbptbpxeqnfxgvuhjnodigixclsgkuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582928.957573-401-23293310572058/AnsiballZ_setup.py'
Dec 01 09:55:29 compute-2 sudo[100262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:29 compute-2 python3.9[100264]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:55:29 compute-2 sudo[100262]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:29 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d600034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:29 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:29 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:30.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:30 compute-2 sudo[100417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whitimeubcclyfvtfhpcpatxoimngphv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582929.9916015-424-94537228671442/AnsiballZ_stat.py'
Dec 01 09:55:30 compute-2 sudo[100417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:30 compute-2 python3.9[100419]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:55:30 compute-2 ceph-mon[76053]: pgmap v126: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:55:30 compute-2 sudo[100417]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:30 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d440036e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:30.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:30 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:30 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:31 compute-2 sudo[100570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-silyycmewjsamxutmqvseqoggjnffxqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582930.7611945-451-269391531059451/AnsiballZ_stat.py'
Dec 01 09:55:31 compute-2 sudo[100570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:31 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d54001950 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:31 compute-2 python3.9[100572]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:55:31 compute-2 sudo[100570]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:31 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d3c0034c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:31 compute-2 sudo[100723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfepfskczsyrpytzslagjocurqynnwxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582931.627117-481-94386031124282/AnsiballZ_command.py'
Dec 01 09:55:31 compute-2 sudo[100723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:31 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:31 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:32 compute-2 python3.9[100725]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:55:32 compute-2 sudo[100723]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:55:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:32.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:55:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:32 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d600034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:32 compute-2 ceph-mon[76053]: pgmap v127: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 01 09:55:32 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:55:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:32.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:32 compute-2 sudo[100877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btyoqyveequdsebgsptcgoobsuoeqzbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582932.4676197-511-1556081501977/AnsiballZ_service_facts.py'
Dec 01 09:55:32 compute-2 sudo[100877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:32 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:32 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:33 compute-2 python3.9[100879]: ansible-service_facts Invoked
Dec 01 09:55:33 compute-2 network[100896]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 01 09:55:33 compute-2 network[100897]: 'network-scripts' will be removed from distribution in near future.
Dec 01 09:55:33 compute-2 network[100898]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 01 09:55:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:33 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d440036e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:33 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d54001950 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:33 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:33 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:34.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:34 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d3c0034c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:34 compute-2 ceph-mon[76053]: pgmap v128: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:55:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:34.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:34 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:34 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:35 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d600034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095535 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 09:55:35 compute-2 sudo[100924]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:55:35 compute-2 sudo[100924]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:55:35 compute-2 sudo[100924]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:35 compute-2 sudo[100952]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 09:55:35 compute-2 sudo[100952]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:55:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:35 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d440036e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:55:35 compute-2 sudo[100952]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:35 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:35 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:36.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:36 compute-2 kernel: ganesha.nfsd[97775]: segfault at 50 ip 00007f2e0f50e32e sp 00007f2dd97f9210 error 4 in libntirpc.so.5.8[7f2e0f4f3000+2c000] likely on CPU 5 (core 0, socket 5)
Dec 01 09:55:36 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec 01 09:55:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:36 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d54001950 fd 39 proxy ignored for local
Dec 01 09:55:36 compute-2 ceph-mon[76053]: pgmap v129: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:55:36 compute-2 systemd[1]: Started Process Core Dump (PID 101079/UID 0).
Dec 01 09:55:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:36.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:36 compute-2 sudo[100877]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:36 compute-2 sudo[101095]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:55:36 compute-2 sudo[101095]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:55:36 compute-2 sudo[101095]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:36 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:36 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:37 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:55:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:37 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:37 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:38 compute-2 systemd-coredump[101080]: Process 97194 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 45:
                                                    #0  0x00007f2e0f50e32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Dec 01 09:55:38 compute-2 systemd[1]: systemd-coredump@1-101079-0.service: Deactivated successfully.
Dec 01 09:55:38 compute-2 systemd[1]: systemd-coredump@1-101079-0.service: Consumed 1.545s CPU time.
Dec 01 09:55:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:38.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:38 compute-2 podman[101150]: 2025-12-01 09:55:38.15899905 +0000 UTC m=+0.028984032 container died 24675a7fbd7929e0e80ba6861d5978cf76b37238256938cdece6f3d5a070a5b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:55:38 compute-2 systemd[1]: var-lib-containers-storage-overlay-7831c2f4beb234a864e2285769eb81e44a1bd0b93251371511203f179768ad4d-merged.mount: Deactivated successfully.
Dec 01 09:55:38 compute-2 systemd[80431]: Created slice User Background Tasks Slice.
Dec 01 09:55:38 compute-2 systemd[80431]: Starting Cleanup of User's Temporary Files and Directories...
Dec 01 09:55:38 compute-2 podman[101150]: 2025-12-01 09:55:38.206693695 +0000 UTC m=+0.076678667 container remove 24675a7fbd7929e0e80ba6861d5978cf76b37238256938cdece6f3d5a070a5b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec 01 09:55:38 compute-2 systemd[80431]: Finished Cleanup of User's Temporary Files and Directories.
Dec 01 09:55:38 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec 01 09:55:38 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec 01 09:55:38 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.787s CPU time.
Dec 01 09:55:38 compute-2 ceph-mon[76053]: pgmap v130: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 09:55:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:55:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:38.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:55:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:38 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:38 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:39 compute-2 sudo[101342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itlfpmcrsahaqududwjiggsazimnqvom ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1764582939.1179545-557-269776926400448/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1764582939.1179545-557-269776926400448/args'
Dec 01 09:55:39 compute-2 sudo[101342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:39 compute-2 sudo[101342]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:39 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:55:39 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:55:39 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:55:39 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:55:39 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:55:39 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:55:39 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:55:39 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:55:39 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:55:39 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 09:55:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:39 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:39 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:40.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:40 compute-2 sudo[101509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekblvjyrcmopkcdpreuxjswzesrqhmlv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582939.8239157-590-28707868843337/AnsiballZ_dnf.py'
Dec 01 09:55:40 compute-2 sudo[101509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:40 compute-2 python3.9[101511]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:55:40 compute-2 ceph-mon[76053]: pgmap v131: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:55:40 compute-2 ceph-mon[76053]: pgmap v132: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 307 B/s rd, 0 op/s
Dec 01 09:55:40 compute-2 ceph-mon[76053]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Dec 01 09:55:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:55:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:40.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:55:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:40 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:40 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:41 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:41 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:41 compute-2 sudo[101509]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:42.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095542 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 09:55:42 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:55:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:42.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:42 compute-2 ceph-mon[76053]: pgmap v133: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 204 B/s rd, 0 op/s
Dec 01 09:55:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:42 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:42 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:43 compute-2 sudo[101665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pychrxrhzttuvddsqndvhsjbeoiipkqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582942.5608544-629-252282778917687/AnsiballZ_package_facts.py'
Dec 01 09:55:43 compute-2 sudo[101665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:43 compute-2 python3.9[101667]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec 01 09:55:43 compute-2 sudo[101665]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:43 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:43 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:55:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:44.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:55:44 compute-2 sudo[101693]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:55:44 compute-2 sudo[101693]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:55:44 compute-2 sudo[101693]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:44.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:44 compute-2 ceph-mon[76053]: pgmap v134: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 204 B/s rd, 0 op/s
Dec 01 09:55:44 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:55:44 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:55:44 compute-2 sudo[101843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcthfniurpvfzhqkuokozydiuvkgudhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582944.4997418-660-22462227711471/AnsiballZ_stat.py'
Dec 01 09:55:44 compute-2 sudo[101843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:44 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:44 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:44 compute-2 python3.9[101845]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:55:45 compute-2 sudo[101843]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:45 compute-2 sudo[101923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eawuwexpvqcoavsmammyppnkmlklzfly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582944.4997418-660-22462227711471/AnsiballZ_file.py'
Dec 01 09:55:45 compute-2 sudo[101923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:45 compute-2 python3.9[101925]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:55:45 compute-2 sudo[101923]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:45 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:45 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:46.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:46 compute-2 sudo[102075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vchufkyvrrshuczojofhvbjllzyvwbrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582946.0060604-696-219924105749720/AnsiballZ_stat.py'
Dec 01 09:55:46 compute-2 sudo[102075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:46 compute-2 python3.9[102077]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:55:46 compute-2 sudo[102075]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000021s ======
Dec 01 09:55:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:46.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000021s
Dec 01 09:55:46 compute-2 sudo[102153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccpezidxxardvmneerzweioaxylkkfes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582946.0060604-696-219924105749720/AnsiballZ_file.py'
Dec 01 09:55:46 compute-2 sudo[102153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:46 compute-2 ceph-mon[76053]: pgmap v135: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 204 B/s rd, 0 op/s
Dec 01 09:55:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:46 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:46 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:46 compute-2 python3.9[102155]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:55:46 compute-2 sudo[102153]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:47 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:55:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:47 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:47 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:48.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:48 compute-2 sudo[102307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvxtuqejwugelbklwldkxbdvsojwbxew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582947.991636-751-52364791314255/AnsiballZ_lineinfile.py'
Dec 01 09:55:48 compute-2 sudo[102307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:48 compute-2 python3.9[102309]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:55:48 compute-2 sudo[102307]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:48 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 2.
Dec 01 09:55:48 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 09:55:48 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.787s CPU time.
Dec 01 09:55:48 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec 01 09:55:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec 01 09:55:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:48.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec 01 09:55:48 compute-2 podman[102382]: 2025-12-01 09:55:48.858392621 +0000 UTC m=+0.114225853 container create 50511f6d12e313e42165f1146f0942754f55ea8d4d17a2b69412f0ed4b77e228 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Dec 01 09:55:48 compute-2 ceph-mon[76053]: pgmap v136: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 511 B/s wr, 1 op/s
Dec 01 09:55:48 compute-2 podman[102382]: 2025-12-01 09:55:48.768459062 +0000 UTC m=+0.024292314 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:55:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:48 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:48 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a39ffe6e3d97de20c2d89b16326a757bb2942271af9162549b11af7a345dd6ac/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 01 09:55:48 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a39ffe6e3d97de20c2d89b16326a757bb2942271af9162549b11af7a345dd6ac/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:55:48 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a39ffe6e3d97de20c2d89b16326a757bb2942271af9162549b11af7a345dd6ac/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:55:48 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a39ffe6e3d97de20c2d89b16326a757bb2942271af9162549b11af7a345dd6ac/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:55:48 compute-2 podman[102382]: 2025-12-01 09:55:48.932083254 +0000 UTC m=+0.187916486 container init 50511f6d12e313e42165f1146f0942754f55ea8d4d17a2b69412f0ed4b77e228 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Dec 01 09:55:48 compute-2 podman[102382]: 2025-12-01 09:55:48.936794722 +0000 UTC m=+0.192627954 container start 50511f6d12e313e42165f1146f0942754f55ea8d4d17a2b69412f0ed4b77e228 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:55:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:48 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:48 compute-2 bash[102382]: 50511f6d12e313e42165f1146f0942754f55ea8d4d17a2b69412f0ed4b77e228
Dec 01 09:55:48 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 09:55:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:48 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 01 09:55:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:48 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 01 09:55:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:48 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 01 09:55:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:48 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 01 09:55:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:48 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 01 09:55:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:48 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 01 09:55:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:49 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 01 09:55:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:49 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 09:55:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:49 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:49 compute-2 sudo[102566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmlvzuxhfpxwuxkeblppgolyeterfhid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582949.6320019-794-152215388688805/AnsiballZ_setup.py'
Dec 01 09:55:49 compute-2 sudo[102566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:49 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:50.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:50 compute-2 python3.9[102568]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:55:50 compute-2 ceph-mon[76053]: pgmap v137: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 511 B/s wr, 1 op/s
Dec 01 09:55:50 compute-2 sudo[102566]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec 01 09:55:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:50.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec 01 09:55:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:50 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:50 compute-2 sudo[102651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtvalawmhujmukvsxknzrgoqzlnjovxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582949.6320019-794-152215388688805/AnsiballZ_systemd.py'
Dec 01 09:55:50 compute-2 sudo[102651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:50 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:51 compute-2 python3.9[102653]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:55:51 compute-2 sudo[102651]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:51 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:51 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec 01 09:55:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:52.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec 01 09:55:52 compute-2 sshd-session[97791]: Connection closed by 192.168.122.30 port 42826
Dec 01 09:55:52 compute-2 sshd-session[97788]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:55:52 compute-2 systemd[1]: session-41.scope: Deactivated successfully.
Dec 01 09:55:52 compute-2 systemd[1]: session-41.scope: Consumed 23.529s CPU time.
Dec 01 09:55:52 compute-2 systemd-logind[795]: Session 41 logged out. Waiting for processes to exit.
Dec 01 09:55:52 compute-2 systemd-logind[795]: Removed session 41.
Dec 01 09:55:52 compute-2 ceph-mon[76053]: pgmap v138: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 767 B/s wr, 3 op/s
Dec 01 09:55:52 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:55:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec 01 09:55:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:52.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec 01 09:55:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:52 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:52 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:53 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:53 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:54.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:54 compute-2 ceph-mon[76053]: pgmap v139: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 767 B/s wr, 3 op/s
Dec 01 09:55:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec 01 09:55:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:54.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec 01 09:55:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:54 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:54 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:55 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Dec 01 09:55:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:55 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Dec 01 09:55:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:55 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 09:55:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:55 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 09:55:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:55 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 01 09:55:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:55:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 09:55:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:55 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 09:55:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:55 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 09:55:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:55 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 09:55:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:55 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 01 09:55:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:55 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 09:55:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:55 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 09:55:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:55 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 09:55:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:55 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:55 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:56.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:56 compute-2 ceph-mon[76053]: pgmap v140: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 767 B/s wr, 3 op/s
Dec 01 09:55:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec 01 09:55:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:56.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec 01 09:55:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:56 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:56 compute-2 sudo[102686]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:55:56 compute-2 sudo[102686]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:55:56 compute-2 sudo[102686]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:56 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095557 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 09:55:57 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:55:57 compute-2 sshd-session[102712]: Accepted publickey for zuul from 192.168.122.30 port 35210 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 09:55:57 compute-2 systemd-logind[795]: New session 42 of user zuul.
Dec 01 09:55:57 compute-2 systemd[1]: Started Session 42 of User zuul.
Dec 01 09:55:57 compute-2 sshd-session[102712]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:55:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:57 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:57 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:55:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:58.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:55:58 compute-2 sudo[102865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkrdnfyrprsbggpkeiclwliiaaoqhzgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582957.8030622-28-3591872675534/AnsiballZ_file.py'
Dec 01 09:55:58 compute-2 sudo[102865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:58 compute-2 python3.9[102867]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:55:58 compute-2 sudo[102865]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:58 compute-2 ceph-mon[76053]: pgmap v141: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 1023 B/s wr, 4 op/s
Dec 01 09:55:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:55:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec 01 09:55:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:58.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec 01 09:55:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:58 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:58 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:59 compute-2 sudo[103018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gaxuidowfjsuvkyhanyzxylhoaljpahl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582958.746728-65-165981768744530/AnsiballZ_stat.py'
Dec 01 09:55:59 compute-2 sudo[103018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:59 compute-2 python3.9[103020]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:55:59 compute-2 sudo[103018]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:59 compute-2 sudo[103097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxkhysodictypritqnfysngjqsngsbwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582958.746728-65-165981768744530/AnsiballZ_file.py'
Dec 01 09:55:59 compute-2 sudo[103097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:55:59 compute-2 python3.9[103099]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:55:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:59 2025: (VI_0) received an invalid passwd!
Dec 01 09:55:59 compute-2 sudo[103097]: pam_unix(sudo:session): session closed for user root
Dec 01 09:55:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:59 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000021s ======
Dec 01 09:56:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:00.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000021s
Dec 01 09:56:00 compute-2 sshd-session[102715]: Connection closed by 192.168.122.30 port 35210
Dec 01 09:56:00 compute-2 sshd-session[102712]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:56:00 compute-2 systemd[1]: session-42.scope: Deactivated successfully.
Dec 01 09:56:00 compute-2 systemd[1]: session-42.scope: Consumed 1.587s CPU time.
Dec 01 09:56:00 compute-2 systemd-logind[795]: Session 42 logged out. Waiting for processes to exit.
Dec 01 09:56:00 compute-2 systemd-logind[795]: Removed session 42.
Dec 01 09:56:00 compute-2 ceph-mon[76053]: pgmap v142: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 597 B/s wr, 2 op/s
Dec 01 09:56:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:00.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:00 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:00 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000008:nfs.cephfs.1: -2
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:01 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:01 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:02.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:02 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f06000016e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:02 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:56:02 compute-2 ceph-mon[76053]: pgmap v143: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 3.7 KiB/s rd, 1.3 KiB/s wr, 5 op/s
Dec 01 09:56:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:02.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:02 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:02 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:03 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:03 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e8000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:03 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:03 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:04.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095604 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 09:56:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:04 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:04 compute-2 ceph-mon[76053]: pgmap v144: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 09:56:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:04.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:04 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:04 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:05 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0600002000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:05 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:05 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:05 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:06 compute-2 sshd-session[103146]: Accepted publickey for zuul from 192.168.122.30 port 35218 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 09:56:06 compute-2 systemd-logind[795]: New session 43 of user zuul.
Dec 01 09:56:06 compute-2 systemd[1]: Started Session 43 of User zuul.
Dec 01 09:56:06 compute-2 sshd-session[103146]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:56:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:06.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:06 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:06.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:06 compute-2 ceph-mon[76053]: pgmap v145: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 09:56:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:06 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:06 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:07 compute-2 python3.9[103300]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:56:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:07 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:56:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:07 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:07 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:07 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:08 compute-2 sudo[103455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppjxaydhqwicxzsqhhkomypyzaxujwqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582967.6993728-61-187042796093774/AnsiballZ_file.py'
Dec 01 09:56:08 compute-2 sudo[103455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec 01 09:56:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:08.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec 01 09:56:08 compute-2 python3.9[103457]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:56:08 compute-2 sudo[103455]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:08 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000019s ======
Dec 01 09:56:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:08.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000019s
Dec 01 09:56:08 compute-2 ceph-mon[76053]: pgmap v146: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 09:56:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:08 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:08 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:09 compute-2 sudo[103631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egcidebaprlibfcjaispnapltwslvlao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582968.592153-86-108801180664015/AnsiballZ_stat.py'
Dec 01 09:56:09 compute-2 sudo[103631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:09 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:09 compute-2 python3.9[103633]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:56:09 compute-2 sudo[103631]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:09 compute-2 sudo[103710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpejkjqmnnkcmvdjsimudzfbvjoaaekf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582968.592153-86-108801180664015/AnsiballZ_file.py'
Dec 01 09:56:09 compute-2 sudo[103710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:09 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 09:56:09 compute-2 python3.9[103712]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.kucffxif recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:56:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:09 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:09 compute-2 sudo[103710]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:09 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:09 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:10.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:10 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:10 compute-2 sudo[103862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anvkjevhkhbkbyliarooyakntqktlcck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582970.3994122-145-4772847744940/AnsiballZ_stat.py'
Dec 01 09:56:10 compute-2 sudo[103862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:10.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:10 compute-2 ceph-mon[76053]: pgmap v147: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 767 B/s wr, 2 op/s
Dec 01 09:56:10 compute-2 python3.9[103864]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:56:10 compute-2 sudo[103862]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:10 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:10 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:11 compute-2 sudo[103941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vaxdkedtsaagphobelkipnqjwmdyquve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582970.3994122-145-4772847744940/AnsiballZ_file.py'
Dec 01 09:56:11 compute-2 sudo[103941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:11 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:11 compute-2 python3.9[103943]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.v7954hin recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:56:11 compute-2 sudo[103941]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:11 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:11 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:11 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:12 compute-2 sudo[104094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnzawxrwlpwondntxriauivhwqwoekqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582971.8084028-184-103855669249301/AnsiballZ_file.py'
Dec 01 09:56:12 compute-2 sudo[104094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec 01 09:56:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:12.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec 01 09:56:12 compute-2 python3.9[104096]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:56:12 compute-2 sudo[104094]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:12 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:12 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:56:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec 01 09:56:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:12.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec 01 09:56:12 compute-2 ceph-mon[76053]: pgmap v148: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 767 B/s wr, 2 op/s
Dec 01 09:56:12 compute-2 sudo[104247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goopgngsgoofryauzfpfuweprtbgpuyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582972.5844274-208-221248060712412/AnsiballZ_stat.py'
Dec 01 09:56:12 compute-2 sudo[104247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:12 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:12 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:13 compute-2 python3.9[104249]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:56:13 compute-2 sudo[104247]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:13 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:13 compute-2 sudo[104327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubtxxmhiulyatwtszexnmgyxjxhzflvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582972.5844274-208-221248060712412/AnsiballZ_file.py'
Dec 01 09:56:13 compute-2 sudo[104327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:13 compute-2 python3.9[104329]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:56:13 compute-2 sudo[104327]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:13 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:13 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:13 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:14 compute-2 sudo[104479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcvnyscdyvayhmhsacpzcootwimvwqzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582973.7766318-208-238909792923329/AnsiballZ_stat.py'
Dec 01 09:56:14 compute-2 sudo[104479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:14.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:14 compute-2 python3.9[104481]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:56:14 compute-2 sudo[104479]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:14 compute-2 sudo[104557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqwofvipqaowkvwdfrujoffpqplerjre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582973.7766318-208-238909792923329/AnsiballZ_file.py'
Dec 01 09:56:14 compute-2 sudo[104557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:14 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:14 compute-2 python3.9[104559]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:56:14 compute-2 sudo[104557]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000019s ======
Dec 01 09:56:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:14.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000019s
Dec 01 09:56:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:14 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:14 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:15 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:15 compute-2 ceph-mon[76053]: pgmap v149: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Dec 01 09:56:15 compute-2 sudo[104711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqfadhwtntcsvyxnvleknqvuktgsnrjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582975.3154004-278-66192143058373/AnsiballZ_file.py'
Dec 01 09:56:15 compute-2 sudo[104711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:15 compute-2 python3.9[104713]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:56:15 compute-2 sudo[104711]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:15 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:15 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:15 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec 01 09:56:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:16.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec 01 09:56:16 compute-2 ceph-mon[76053]: pgmap v150: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Dec 01 09:56:16 compute-2 sudo[104863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzzxexhpxxzitppzhpjcqjzdqlahvlhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582976.0386088-301-41991182167425/AnsiballZ_stat.py'
Dec 01 09:56:16 compute-2 sudo[104863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:16 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:16 compute-2 python3.9[104865]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:56:16 compute-2 sudo[104863]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec 01 09:56:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:16.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec 01 09:56:16 compute-2 sudo[104941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tefmbgaygcxxgynmjvqgiamuxhfreurn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582976.0386088-301-41991182167425/AnsiballZ_file.py'
Dec 01 09:56:16 compute-2 sudo[104941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:16 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:16 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:17 compute-2 sudo[104945]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:56:17 compute-2 sudo[104945]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:56:17 compute-2 sudo[104945]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:17 compute-2 python3.9[104943]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:56:17 compute-2 sudo[104941]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:17 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:17 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:56:17 compute-2 sudo[105120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjqllywcfldlnueeistwrvqcrfyqpheb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582977.3689427-338-177515194621210/AnsiballZ_stat.py'
Dec 01 09:56:17 compute-2 sudo[105120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:17 compute-2 python3.9[105122]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:56:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:17 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:17 compute-2 sudo[105120]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:17 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:17 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:18 compute-2 sudo[105198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuurbzfgcsmrfswbtibbbfrfomgvbzbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582977.3689427-338-177515194621210/AnsiballZ_file.py'
Dec 01 09:56:18 compute-2 sudo[105198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:18.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:18 compute-2 python3.9[105200]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:56:18 compute-2 sudo[105198]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:18 compute-2 ceph-mon[76053]: pgmap v151: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Dec 01 09:56:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:18 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec 01 09:56:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:18.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec 01 09:56:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:18 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:18 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:19 compute-2 sudo[105351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftjsxfqxmthaiwxjbvcfqvyzdjvfmnxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582978.5509214-373-273303304916184/AnsiballZ_systemd.py'
Dec 01 09:56:19 compute-2 sudo[105351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:19 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:19 compute-2 python3.9[105353]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:56:19 compute-2 systemd[1]: Reloading.
Dec 01 09:56:19 compute-2 systemd-rc-local-generator[105382]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:56:19 compute-2 systemd-sysv-generator[105385]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:56:19 compute-2 sudo[105351]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:19 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:19 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095619 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 09:56:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:19 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec 01 09:56:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:20.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec 01 09:56:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:20 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:20 compute-2 ceph-mon[76053]: pgmap v152: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:56:20 compute-2 sudo[105542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocepvrmablvzsqisctfclxscresjmloe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582980.1579335-398-168204383402703/AnsiballZ_stat.py'
Dec 01 09:56:20 compute-2 sudo[105542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:20.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:20 compute-2 python3.9[105544]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:56:20 compute-2 sudo[105542]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:20 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:20 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:21 compute-2 sudo[105621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woyqosvqapmpdferkvqmknmrwaljknsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582980.1579335-398-168204383402703/AnsiballZ_file.py'
Dec 01 09:56:21 compute-2 sudo[105621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:21 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:21 compute-2 python3.9[105623]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:56:21 compute-2 sudo[105621]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:21 compute-2 sudo[105774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yusspafyqrgvjmkxngzmiyjfunmrsfsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582981.4909358-434-240834857293368/AnsiballZ_stat.py'
Dec 01 09:56:21 compute-2 sudo[105774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:21 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:21 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:21 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:21 compute-2 python3.9[105776]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:56:22 compute-2 sudo[105774]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec 01 09:56:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:22.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec 01 09:56:22 compute-2 sudo[105852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejyarmidyzkoxmiqmdjamcvxrslfnvet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582981.4909358-434-240834857293368/AnsiballZ_file.py'
Dec 01 09:56:22 compute-2 sudo[105852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:22 compute-2 python3.9[105854]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:56:22 compute-2 sudo[105852]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:22 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:22 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:56:22 compute-2 ceph-mon[76053]: pgmap v153: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:56:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:22.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:22 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:22 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:23 compute-2 sudo[106005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgknnhkjyasbwbgipdgjdaoqttykxxbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582982.8524988-469-115562593941118/AnsiballZ_systemd.py'
Dec 01 09:56:23 compute-2 sudo[106005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:23 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:23 compute-2 python3.9[106007]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:56:23 compute-2 systemd[1]: Reloading.
Dec 01 09:56:23 compute-2 systemd-rc-local-generator[106037]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:56:23 compute-2 systemd-sysv-generator[106041]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:56:23 compute-2 sshd-session[104276]: error: kex_exchange_identification: read: Connection timed out
Dec 01 09:56:23 compute-2 sshd-session[104276]: banner exchange: Connection from 14.22.89.30 port 51028: Connection timed out
Dec 01 09:56:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:23 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:23 compute-2 systemd[1]: Starting Create netns directory...
Dec 01 09:56:23 compute-2 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 01 09:56:23 compute-2 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 01 09:56:23 compute-2 systemd[1]: Finished Create netns directory.
Dec 01 09:56:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:23 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:23 compute-2 sudo[106005]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:23 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:24.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:24 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:24 compute-2 ceph-mon[76053]: pgmap v154: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:56:24 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 09:56:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:24.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:24 compute-2 python3.9[106201]: ansible-ansible.builtin.service_facts Invoked
Dec 01 09:56:24 compute-2 network[106219]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 01 09:56:24 compute-2 network[106220]: 'network-scripts' will be removed from distribution in near future.
Dec 01 09:56:24 compute-2 network[106221]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 01 09:56:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:24 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:24 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:25 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:25 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:25 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:25 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:26.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:26 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec 01 09:56:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:26.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec 01 09:56:26 compute-2 ceph-mon[76053]: pgmap v155: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:56:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:26 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:26 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:27 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:27 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:56:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:27 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:27 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:27 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:28.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:28 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 09:56:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:28 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:28.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:28 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:28 compute-2 ceph-mon[76053]: pgmap v156: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 852 B/s rd, 85 B/s wr, 0 op/s
Dec 01 09:56:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:28 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:29 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:29 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:29 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:29 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec 01 09:56:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:30.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec 01 09:56:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:30 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec 01 09:56:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:30.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec 01 09:56:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:30 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:30 compute-2 ceph-mon[76053]: pgmap v157: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Dec 01 09:56:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:30 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:31 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:31 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 09:56:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:31 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 09:56:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:31 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 09:56:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:31 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:31 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:31 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:31 compute-2 sudo[106490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyfrdalmguyszilhkkjfguibfqzvxjpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582991.690756-548-151254266164153/AnsiballZ_stat.py'
Dec 01 09:56:31 compute-2 sudo[106490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:32 compute-2 ceph-mon[76053]: pgmap v158: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Dec 01 09:56:32 compute-2 python3.9[106492]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:56:32 compute-2 sudo[106490]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:32.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:32 compute-2 sudo[106568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djhngpsdxfsdqwactzfuqxtmhcipmiyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582991.690756-548-151254266164153/AnsiballZ_file.py'
Dec 01 09:56:32 compute-2 sudo[106568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:32 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:32 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:56:32 compute-2 python3.9[106570]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:56:32 compute-2 sudo[106568]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:32.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:32 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:32 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:33 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:33 compute-2 sudo[106723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oimtkbmyqkoeqrwkjyeektqditwqqlbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582993.101015-587-158178347077259/AnsiballZ_file.py'
Dec 01 09:56:33 compute-2 sudo[106723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:33 compute-2 python3.9[106725]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:56:33 compute-2 sudo[106723]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:33 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:33 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:33 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:34.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:34 compute-2 sudo[106875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-geoksasqfqwozehthjdzixtrathonjkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582993.9406793-610-158271761301315/AnsiballZ_stat.py'
Dec 01 09:56:34 compute-2 sudo[106875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:34 compute-2 python3.9[106877]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:56:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:34 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 09:56:34 compute-2 ceph-mon[76053]: pgmap v159: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Dec 01 09:56:34 compute-2 sudo[106875]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:34 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614002010 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:34 compute-2 sudo[106953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwfhmkzvnyqidhbckqvqniwhjmxqqgpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582993.9406793-610-158271761301315/AnsiballZ_file.py'
Dec 01 09:56:34 compute-2 sudo[106953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:34.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:34 compute-2 python3.9[106955]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:56:34 compute-2 sudo[106953]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:34 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:34 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:35 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:35 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:35 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:35 compute-2 sudo[107107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yajdoucqepnebepfiyhrhgncypbabkfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582995.5072396-656-42386508156541/AnsiballZ_timezone.py'
Dec 01 09:56:35 compute-2 sudo[107107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:35 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:36 compute-2 python3.9[107109]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 01 09:56:36 compute-2 systemd[1]: Starting Time & Date Service...
Dec 01 09:56:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:36.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:36 compute-2 systemd[1]: Started Time & Date Service.
Dec 01 09:56:36 compute-2 sudo[107107]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:36 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:36 compute-2 ceph-mon[76053]: pgmap v160: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Dec 01 09:56:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:36.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:36 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:36 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:36 compute-2 sudo[107264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbrmapiskjuvqcvqvnhpebfeemcszewo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582996.7241163-683-2026769817957/AnsiballZ_file.py'
Dec 01 09:56:36 compute-2 sudo[107264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:37 compute-2 sudo[107267]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:56:37 compute-2 sudo[107267]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:56:37 compute-2 sudo[107267]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:37 compute-2 python3.9[107266]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:56:37 compute-2 sudo[107264]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:37 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614002010 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:37 compute-2 sshd-session[106228]: Received disconnect from 45.78.219.119 port 35022:11: Bye Bye [preauth]
Dec 01 09:56:37 compute-2 sshd-session[106228]: Disconnected from authenticating user root 45.78.219.119 port 35022 [preauth]
Dec 01 09:56:37 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:56:37 compute-2 sudo[107442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsrcbcvpitrwodqjefcbzyvrztxzgnix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582997.5308254-707-79671580670055/AnsiballZ_stat.py'
Dec 01 09:56:37 compute-2 sudo[107442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:37 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:37 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:37 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:37 compute-2 python3.9[107444]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:56:38 compute-2 sudo[107442]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec 01 09:56:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:38.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec 01 09:56:38 compute-2 sudo[107520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzqvourduhdqagxjqsmvljhaznnmoriv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582997.5308254-707-79671580670055/AnsiballZ_file.py'
Dec 01 09:56:38 compute-2 sudo[107520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:38 compute-2 python3.9[107522]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:56:38 compute-2 sudo[107520]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:38 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:38 compute-2 ceph-mon[76053]: pgmap v161: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 09:56:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec 01 09:56:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:38.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec 01 09:56:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:38 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:38 compute-2 sudo[107673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzwwvhgngtczjcrvidwpdfowdhzdmzhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582998.6742778-743-32283661465582/AnsiballZ_stat.py'
Dec 01 09:56:38 compute-2 sudo[107673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:38 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.016062) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582999016283, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1334, "num_deletes": 250, "total_data_size": 3378186, "memory_usage": 3437592, "flush_reason": "Manual Compaction"}
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582999026904, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 1331906, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10717, "largest_seqno": 12046, "table_properties": {"data_size": 1327527, "index_size": 1903, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 11125, "raw_average_key_size": 20, "raw_value_size": 1318060, "raw_average_value_size": 2370, "num_data_blocks": 84, "num_entries": 556, "num_filter_entries": 556, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582880, "oldest_key_time": 1764582880, "file_creation_time": 1764582999, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 10885 microseconds, and 4529 cpu microseconds.
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.026974) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 1331906 bytes OK
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.027000) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.028642) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.028659) EVENT_LOG_v1 {"time_micros": 1764582999028654, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.028685) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 3371958, prev total WAL file size 3371958, number of live WAL files 2.
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.029693) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323531' seq:0, type:0; will stop at (end)
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(1300KB)], [21(14MB)]
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582999029813, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 16141988, "oldest_snapshot_seqno": -1}
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 4224 keys, 13855416 bytes, temperature: kUnknown
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582999121199, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 13855416, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13823594, "index_size": 20192, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10565, "raw_key_size": 107605, "raw_average_key_size": 25, "raw_value_size": 13742684, "raw_average_value_size": 3253, "num_data_blocks": 863, "num_entries": 4224, "num_filter_entries": 4224, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764582999, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.121585) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 13855416 bytes
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.123453) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 176.3 rd, 151.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 14.1 +0.0 blob) out(13.2 +0.0 blob), read-write-amplify(22.5) write-amplify(10.4) OK, records in: 4686, records dropped: 462 output_compression: NoCompression
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.123477) EVENT_LOG_v1 {"time_micros": 1764582999123464, "job": 10, "event": "compaction_finished", "compaction_time_micros": 91567, "compaction_time_cpu_micros": 38797, "output_level": 6, "num_output_files": 1, "total_output_size": 13855416, "num_input_records": 4686, "num_output_records": 4224, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582999124169, "job": 10, "event": "table_file_deletion", "file_number": 23}
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582999126633, "job": 10, "event": "table_file_deletion", "file_number": 21}
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.029547) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.126815) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.126820) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.126821) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.126823) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:56:39 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.126825) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:56:39 compute-2 python3.9[107675]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:56:39 compute-2 sudo[107673]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:39 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:39 compute-2 sudo[107752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfhxhlgrgldmzqprdzjkhthrzlobjdeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582998.6742778-743-32283661465582/AnsiballZ_file.py'
Dec 01 09:56:39 compute-2 sudo[107752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:39 compute-2 python3.9[107754]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.fmvl_raz recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:56:39 compute-2 sudo[107752]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:39 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614002010 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:39 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095639 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 09:56:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:39 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:40 compute-2 ceph-mon[76053]: pgmap v162: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 938 B/s wr, 2 op/s
Dec 01 09:56:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 09:56:40 compute-2 sudo[107904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udfmxnaeewtdcdkqbinxubgdbktlwkpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582999.8482616-779-8134802383091/AnsiballZ_stat.py'
Dec 01 09:56:40 compute-2 sudo[107904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:56:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:40.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:56:40 compute-2 python3.9[107906]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:56:40 compute-2 sudo[107904]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:40 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:40 compute-2 sudo[107982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjbnijevivkzxuspumjvomdssezkpcwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764582999.8482616-779-8134802383091/AnsiballZ_file.py'
Dec 01 09:56:40 compute-2 sudo[107982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:40 compute-2 python3.9[107984]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:56:40 compute-2 sudo[107982]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:56:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:40.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:56:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:40 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:40 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:41 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:41 compute-2 sudo[108136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-paazrpzlhtrqcsjscangcpwalxhqcizb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583001.2268312-818-57047781629853/AnsiballZ_command.py'
Dec 01 09:56:41 compute-2 sudo[108136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:41 compute-2 python3.9[108138]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:56:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:41 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:41 compute-2 sudo[108136]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:41 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:41 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:42.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:42 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614002010 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:42 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:56:42 compute-2 sudo[108289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tomydjabgwmepucxcfxfkbeoisutszub ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764583002.1550474-842-93183370727080/AnsiballZ_edpm_nftables_from_files.py'
Dec 01 09:56:42 compute-2 sudo[108289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:42 compute-2 python3[108292]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 01 09:56:42 compute-2 sudo[108289]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:56:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:42.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:56:42 compute-2 ceph-mon[76053]: pgmap v163: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 3 op/s
Dec 01 09:56:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:42 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:42 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:43 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:43 compute-2 sudo[108444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chbxfbhltqxlohsbbdzmlxjocukeaxoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583003.0510244-866-89465304199690/AnsiballZ_stat.py'
Dec 01 09:56:43 compute-2 sudo[108444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:43 compute-2 python3.9[108446]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:56:43 compute-2 sudo[108444]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:43 compute-2 sudo[108522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtalerjvmrajohhjuraxxabjzpvwhbvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583003.0510244-866-89465304199690/AnsiballZ_file.py'
Dec 01 09:56:43 compute-2 sudo[108522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:43 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f06000008d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:43 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:43 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:44 compute-2 python3.9[108524]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:56:44 compute-2 sudo[108522]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:44.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:44 compute-2 ceph-mon[76053]: pgmap v164: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Dec 01 09:56:44 compute-2 sudo[108601]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:56:44 compute-2 sudo[108601]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:56:44 compute-2 sudo[108601]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:44 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:44 compute-2 sudo[108639]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 09:56:44 compute-2 sudo[108639]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:56:44 compute-2 sudo[108724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-padwuiehzlxkkyclayjycpqzjmddxxsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583004.3805358-903-281220796286585/AnsiballZ_stat.py'
Dec 01 09:56:44 compute-2 sudo[108724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 09:56:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:44.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 09:56:44 compute-2 python3.9[108726]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:56:44 compute-2 sudo[108724]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:44 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:44 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:45 compute-2 sudo[108639]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:45 compute-2 sudo[108834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuybgkhuhvmdufobyvqooxajkkjihryw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583004.3805358-903-281220796286585/AnsiballZ_file.py'
Dec 01 09:56:45 compute-2 sudo[108834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:45 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f06140091b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:45 compute-2 python3.9[108836]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:56:45 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:56:45 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:56:45 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:56:45 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:56:45 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:56:45 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:56:45 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:56:45 compute-2 sudo[108834]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:45 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:45 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:45 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:46 compute-2 sudo[108987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alafxnzqcmompcjosrbloskeppkzbftt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583005.7213383-938-115643297718627/AnsiballZ_stat.py'
Dec 01 09:56:46 compute-2 sudo[108987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:46 compute-2 python3.9[108989]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:56:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:46.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:46 compute-2 sudo[108987]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:46 compute-2 ceph-mon[76053]: pgmap v165: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 431 B/s wr, 2 op/s
Dec 01 09:56:46 compute-2 sudo[109065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bybsounqeejguplovqdntarjllhuejsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583005.7213383-938-115643297718627/AnsiballZ_file.py'
Dec 01 09:56:46 compute-2 sudo[109065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:46 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f06000008d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:46 compute-2 python3.9[109067]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:56:46 compute-2 sudo[109065]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 09:56:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:46.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 09:56:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:46 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:46 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:47 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:47 compute-2 sudo[109219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tquaokvlaopmdvyedkilrkgoglvchxyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583007.2354176-974-279030304016747/AnsiballZ_stat.py'
Dec 01 09:56:47 compute-2 sudo[109219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:47 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:56:47 compute-2 python3.9[109221]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:56:47 compute-2 sudo[109219]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:47 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f06140091b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:47 compute-2 sudo[109297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhygzyaqadmifyfxyliyrjgkizptwqic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583007.2354176-974-279030304016747/AnsiballZ_file.py'
Dec 01 09:56:47 compute-2 sudo[109297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:47 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:47 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:48 compute-2 python3.9[109299]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:56:48 compute-2 sudo[109297]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:48.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:48 compute-2 ceph-mon[76053]: pgmap v166: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 431 B/s wr, 2 op/s
Dec 01 09:56:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:48 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:48.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:48 compute-2 sudo[109450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzlfpawhyihxbvixcscrrwgztpkxveks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583008.5336676-1010-245898668081573/AnsiballZ_stat.py'
Dec 01 09:56:48 compute-2 sudo[109450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:48 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:48 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:49 compute-2 python3.9[109452]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:56:49 compute-2 sudo[109450]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:49 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f06000008d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:49 compute-2 sudo[109529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfylvssdkfbgoumofsqwvkxwnzumbyua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583008.5336676-1010-245898668081573/AnsiballZ_file.py'
Dec 01 09:56:49 compute-2 sudo[109529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:49 compute-2 python3.9[109531]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:56:49 compute-2 sudo[109529]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:49 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:49 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:49 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:50.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:50 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:50 compute-2 ceph-mon[76053]: pgmap v167: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 258 B/s rd, 0 B/s wr, 0 op/s
Dec 01 09:56:50 compute-2 sudo[109681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwuuvbwwupvuwjbtuspcawuvrkfuzshp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583010.3337226-1049-133393205399262/AnsiballZ_command.py'
Dec 01 09:56:50 compute-2 sudo[109681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:50 compute-2 python3.9[109683]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:56:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:56:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:50.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:56:50 compute-2 sudo[109681]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:50 compute-2 sudo[109687]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:56:50 compute-2 sudo[109687]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:56:50 compute-2 sudo[109687]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:50 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:50 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:51 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:51 compute-2 sudo[109863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpxqrqylfnvpmayuhzqvctvyazoczzop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583011.1529455-1074-873413571462/AnsiballZ_blockinfile.py'
Dec 01 09:56:51 compute-2 sudo[109863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:51 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:56:51 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:56:51 compute-2 python3.9[109865]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:56:51 compute-2 sudo[109863]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:51 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0600002800 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:51 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:51 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 09:56:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:52.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 09:56:52 compute-2 sudo[110015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugsslbrmnjjyahpejcdoratlwpdviuyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583012.1952093-1099-79022671473706/AnsiballZ_file.py'
Dec 01 09:56:52 compute-2 sudo[110015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:52 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0600002800 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:52 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:56:52 compute-2 python3.9[110017]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:56:52 compute-2 sudo[110015]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:52 compute-2 ceph-mon[76053]: pgmap v168: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 345 B/s rd, 0 B/s wr, 0 op/s
Dec 01 09:56:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:56:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:52.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:56:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:52 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:52 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:53 compute-2 sudo[110168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgtzynuudbgcqfjgyunggywzanggvnwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583012.795-1099-209033575466402/AnsiballZ_file.py'
Dec 01 09:56:53 compute-2 sudo[110168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:53 compute-2 python3.9[110170]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:56:53 compute-2 sudo[110168]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:53 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:53 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:53 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:53 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:54 compute-2 sudo[110321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngnwzfsldnhddtahqzcflcpsfityhvoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583013.630279-1145-135227603751762/AnsiballZ_mount.py'
Dec 01 09:56:54 compute-2 sudo[110321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:54.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:54 compute-2 python3.9[110323]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 01 09:56:54 compute-2 sudo[110321]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:54 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0600002800 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:54 compute-2 ceph-mon[76053]: pgmap v169: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 258 B/s rd, 0 op/s
Dec 01 09:56:54 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 09:56:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:56:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:54.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:56:54 compute-2 sudo[110474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thwtbgxgrrfpmbcyukopnreinssxfaxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583014.5227306-1145-206812378404370/AnsiballZ_mount.py'
Dec 01 09:56:54 compute-2 sudo[110474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:56:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:54 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:54 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:55 compute-2 python3.9[110476]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 01 09:56:55 compute-2 sudo[110474]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:55 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003dd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:55 compute-2 sshd-session[103149]: Connection closed by 192.168.122.30 port 35218
Dec 01 09:56:55 compute-2 sshd-session[103146]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:56:55 compute-2 systemd[1]: session-43.scope: Deactivated successfully.
Dec 01 09:56:55 compute-2 systemd[1]: session-43.scope: Consumed 28.504s CPU time.
Dec 01 09:56:55 compute-2 systemd-logind[795]: Session 43 logged out. Waiting for processes to exit.
Dec 01 09:56:55 compute-2 systemd-logind[795]: Removed session 43.
Dec 01 09:56:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:55 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:55 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:55 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:56:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:56.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:56:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:56 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:56 compute-2 ceph-mon[76053]: pgmap v170: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 258 B/s rd, 0 op/s
Dec 01 09:56:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 09:56:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:56.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 09:56:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:56 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:56 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:57 compute-2 sudo[110503]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:56:57 compute-2 sudo[110503]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:56:57 compute-2 sudo[110503]: pam_unix(sudo:session): session closed for user root
Dec 01 09:56:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:57 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0600002800 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:57 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:56:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:57 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:57 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:57 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 09:56:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:58.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 09:56:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:58 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:58 compute-2 ceph-mon[76053]: pgmap v171: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 01 09:56:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:56:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:56:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:58.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:56:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:58 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:58 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:59 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:59 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003e10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:56:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:59 2025: (VI_0) received an invalid passwd!
Dec 01 09:56:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:59 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 09:57:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:00.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 09:57:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:00 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003e10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:00 compute-2 ceph-mon[76053]: pgmap v172: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:57:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:57:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:00.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:57:00 compute-2 sshd-session[110531]: Accepted publickey for zuul from 192.168.122.30 port 51444 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 09:57:00 compute-2 systemd-logind[795]: New session 44 of user zuul.
Dec 01 09:57:00 compute-2 systemd[1]: Started Session 44 of User zuul.
Dec 01 09:57:00 compute-2 sshd-session[110531]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:57:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:00 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:00 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:01 compute-2 sudo[110686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycxsqqajdgkhakrtiuoklmrjalkbmddx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583020.998602-20-152125567218949/AnsiballZ_tempfile.py'
Dec 01 09:57:01 compute-2 sudo[110686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:01 compute-2 python3.9[110688]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec 01 09:57:01 compute-2 sudo[110686]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:01 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:01 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:57:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:02.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:57:02 compute-2 sudo[110838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvkjuxdaqmupizdxkulndezoxxhxrtbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583021.987377-56-119558146651492/AnsiballZ_stat.py'
Dec 01 09:57:02 compute-2 sudo[110838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:02 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:57:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:02 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003e10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:02 compute-2 python3.9[110840]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:57:02 compute-2 sudo[110838]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:02 compute-2 ceph-mon[76053]: pgmap v173: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 09:57:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:57:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:02.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:57:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:02 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:02 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:03 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003e10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:03 compute-2 sudo[110994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whovaezywohkzyjuxllmmenpiwswycpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583022.9602122-81-140903931656319/AnsiballZ_slurp.py'
Dec 01 09:57:03 compute-2 sudo[110994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:03 compute-2 python3.9[110996]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Dec 01 09:57:03 compute-2 sudo[110994]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:03 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:03 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:03 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:04.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:04 compute-2 sudo[111148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvvwqqzyocrgxgjzudcczhyrtglckiit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583024.0507872-104-62459907723138/AnsiballZ_stat.py'
Dec 01 09:57:04 compute-2 sudo[111148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:04 compute-2 python3.9[111150]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.wbvxs786 follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:57:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:04 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:04 compute-2 sudo[111148]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:57:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:04.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:57:04 compute-2 ceph-mon[76053]: pgmap v174: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:57:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:04 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:04 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:05 compute-2 sudo[111274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipgxxwzdhagvigripslazuxykfrqqqia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583024.0507872-104-62459907723138/AnsiballZ_copy.py'
Dec 01 09:57:05 compute-2 sudo[111274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:05 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e8002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:05 compute-2 python3.9[111276]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.wbvxs786 mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583024.0507872-104-62459907723138/.source.wbvxs786 _original_basename=.liurd8he follow=False checksum=8dc09b174cc5b8debe148224e7d00f23d70f4242 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:57:05 compute-2 sudo[111274]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:05 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05dc000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:05 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:05 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:06 compute-2 ceph-mon[76053]: pgmap v175: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:57:06 compute-2 sudo[111427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xufisdzvauflvkzssidiqsyqktfflcly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583025.6043084-149-209956596285968/AnsiballZ_setup.py'
Dec 01 09:57:06 compute-2 sudo[111427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:57:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:06.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:57:06 compute-2 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 01 09:57:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:06 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0600003cf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:06 compute-2 python3.9[111429]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:57:06 compute-2 sudo[111427]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:57:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:06.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:57:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:06 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:06 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:07 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:07 compute-2 sudo[111583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihymhytrrmndmzovykbjytdwdofddarm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583026.9999862-174-528413012708/AnsiballZ_blockinfile.py'
Dec 01 09:57:07 compute-2 sudo[111583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:57:07 compute-2 python3.9[111585]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC9iOYT2GM4L6SHZTMq11oZ+BAk/eXQ8XBJJYa2Eo/9VKQiuDMNzjXWKc1heeqMgloaJAk+En3hPDTZcnt14xKW0weSVhc1GuXBU3IqdQGeO3nyjdhUNxj2O6Syt/8Srh0+ne/yimC9BxBrCHKmwPPCx0TTtiy3n953HP5w0wedM8MI2bl9X4CaVwEtwSUbhFJgRaAVvg1jWUBV+tE9CGQXy1Y7raeATTLvRa3PIqU2pSDvvN44SuFWubkATb9CNZfejG2Tz2N709KveFa1tPaAjiuj046dUN+nb5eMroLvf2T2MoSQ12AUXHcpxVB6qb918qUpn8x9/V65c4fkXQ3nNgbF3IHP7RcwSs0XISdGLMT1NPTmYDhECjFDqTwkiK+goHUXZY3N3dYfjS9uqS1/66OIDlWK6niL0DMO6j+L/iriIIzPVWmrEz384bDc+wVQgGjmVXolCOWq/vp6TE1nAFqsNTZmQXC8BHCGtitnnWgzgbJX3D4O4dBOqHqdPr8=
                                             compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGEIBRopLb4IdSGL1f5PVbv9932FzGHz/9YCDTQr6PvA
                                             compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDEJ0q084PIbFOMDxHa25lnKuVffDClzijZagkDx2W3Z17XxuTVNXMnebqlksv3x5cE8TQLF/PIAPJS87wX+Nuo=
                                             compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC+tytlc2ziEXCaePFL6NCHfQfG5hnoDOgK+/O6WujzT2GFJESz6sgXypOXA+ry9uSM1AFkZgIIj7YfrFvtxYbWsEyzbhXKiOr8noIZGkfc+43imB+C2FgUp5ZwQSFnnxyIiXQWwKIjrOXbXE1r5SClA+FIAojDoectq/AbKwehIzD1ayHdfehF7BTfXJbkf64RgNcctGyjz0LPxY2mXC0kQXEFZSqJIOn5sys9wQEkjd4XlXA66oaJPV948m4ApJniNd9ohIVmXKAO5Bo6D4WQVvrA03w7PurWjJmpQuKNNwzAn2MMUfwfF0FiH9nxKa5/yEHRA/jTlNtqA/xOFC1uvGvgfWLDMfh+AtXxrNJXtp+qeATiUthHFK9ZRT6xaqkdd+LzySkLVyUCxpvEeOSKcHCqoxNBMZ5p9skmKbus5DRvzBSzPSGfBqh+7efuwSYYRveVZ2iqukef+cMJ5t+mlGuIAZulVVeLXhivpqH20o4d+WgBLNWpPZtP1w3vnds=
                                             compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKDMbjmqVhbMiFxfeq71aiHzezH5+ve9aaRv6tecZ9yt
                                             compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBD2a9/UKab06QjpszdfyP/8+Fmx0ghbxasoTU/24//g4p6oYwAMEXLcqU8YkQj66SK/B/CRmkko20tQpuvcB+LQ=
                                             compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDNxuYL62ECxG4tKU506Q3pIBb6yt0LTfxUgzUGORrXbIq9WrYwVeb+Lkx8v046r7H1KM8BsXHHuc+/3UYA3ldToNXUkjnpV43woAUm6zBViUE4+fgkcOJmVpRTZ/uXPMGTCGECUFZ9zuo3AFkcF0ERCcieOSdVs4uPytJLM0anMY2JZ9BHHzwlK3u+R7I452i/2bTjizB5yGGjV/5usLKdzn3gANHxbNcnVh+sI8fLZDldSAoeh+Lmihzsfp+4optdWgF0GnEgV3ui8NyR+nrPN2A09+4jC0EKzW3P8PT6CaTEgt95tkEYJ0/ihBlX210GmX32GEZfnHIOSflIiIeeAz/8vomjGlRwArfsmlOxT56Q9rekK5hD2orlFCjOvrzfoJN7vvTaE/P8ls/6015TUzbkS2WqhMLJbIvNcumWshvtYifwfnwMI2BK7YTHKpx1Qc/3anJqszHfO0G7ar3+3DemlY50qxApCrKUlE/w1rQtiN1VKmlioP2XpCmwe1s=
                                             compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKm9ziDthsQekJ2ppuyoRsJLe7WplMYSfdzI6Ftkcb9s
                                             compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAnzEG8a/rCCjdE5RU3Uk/1EHo5xwDY20eWwn6aeXJMS7blUnv3gyCa8WoIefjhilEbylrojzG4Tmv2ZgeeLQd4=
                                              create=True mode=0644 path=/tmp/ansible.wbvxs786 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:57:07 compute-2 sudo[111583]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:07 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:07 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:07 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:08 compute-2 sudo[111735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otvkvbtroqakoxkhmwstbkogmmuqohei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583027.8772924-198-35496838776593/AnsiballZ_command.py'
Dec 01 09:57:08 compute-2 sudo[111735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:08.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:08 compute-2 ceph-mon[76053]: pgmap v176: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 01 09:57:08 compute-2 python3.9[111737]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.wbvxs786' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:57:08 compute-2 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 01 09:57:08 compute-2 sudo[111735]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:08 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05dc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:08.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:08 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:08 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:09 compute-2 sudo[111891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcfnqkubyefbyosghjgqlwrgmcddgmlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583028.824741-222-164924881228850/AnsiballZ_file.py'
Dec 01 09:57:09 compute-2 sudo[111891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:09 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0600003cf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:09 compute-2 python3.9[111894]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.wbvxs786 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:57:09 compute-2 sudo[111891]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:09 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e8002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:09 compute-2 sshd-session[110535]: Connection closed by 192.168.122.30 port 51444
Dec 01 09:57:09 compute-2 sshd-session[110531]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:57:09 compute-2 systemd-logind[795]: Session 44 logged out. Waiting for processes to exit.
Dec 01 09:57:09 compute-2 systemd[1]: session-44.scope: Deactivated successfully.
Dec 01 09:57:09 compute-2 systemd[1]: session-44.scope: Consumed 5.143s CPU time.
Dec 01 09:57:09 compute-2 systemd-logind[795]: Removed session 44.
Dec 01 09:57:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:09 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:09 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:10.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:10 compute-2 ceph-mon[76053]: pgmap v177: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:57:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 09:57:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:10 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:10.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:10 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:10 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:11 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05dc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:11 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0600003cf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:11 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:11 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 09:57:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:12.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 09:57:12 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:57:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:12 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e8002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:12 compute-2 ceph-mon[76053]: pgmap v178: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 09:57:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:57:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:12.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:57:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:12 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:12 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:13 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:13 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:13 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:13 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:14.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:14 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0600003cf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:14 compute-2 ceph-mon[76053]: pgmap v179: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:57:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:14.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:14 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:14 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:15 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e8002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:15 compute-2 sshd-session[111924]: Accepted publickey for zuul from 192.168.122.30 port 40420 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 09:57:15 compute-2 systemd-logind[795]: New session 45 of user zuul.
Dec 01 09:57:15 compute-2 systemd[1]: Started Session 45 of User zuul.
Dec 01 09:57:15 compute-2 sshd-session[111924]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:57:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:15 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05dc002720 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:15 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:15 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:16 compute-2 python3.9[112078]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:57:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:16.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:16 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f061400a7e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:57:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:16.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:57:16 compute-2 ceph-mon[76053]: pgmap v180: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:57:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:16 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:16 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:17 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0600003cf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:17 compute-2 sudo[112166]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:57:17 compute-2 sudo[112166]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:57:17 compute-2 sudo[112166]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:17 compute-2 sudo[112259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgahkaofdeqvzbquyzsgkesvkmazwgzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583036.7560437-58-267730761142429/AnsiballZ_systemd.py'
Dec 01 09:57:17 compute-2 sudo[112259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:17 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:57:17 compute-2 python3.9[112261]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 01 09:57:17 compute-2 sudo[112259]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:17 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e8003240 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:17 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:17 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 09:57:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:18.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 09:57:18 compute-2 sudo[112413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qebhtpqfkhtvhzpphalvtihdkscgtxvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583038.1009476-82-233747029111743/AnsiballZ_systemd.py'
Dec 01 09:57:18 compute-2 sudo[112413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:18 compute-2 kernel: ganesha.nfsd[111005]: segfault at 50 ip 00007f06bea0432e sp 00007f068e7fb210 error 4 in libntirpc.so.5.8[7f06be9e9000+2c000] likely on CPU 0 (core 0, socket 0)
Dec 01 09:57:18 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec 01 09:57:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:18 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e8003240 fd 39 proxy ignored for local
Dec 01 09:57:18 compute-2 systemd[1]: Started Process Core Dump (PID 112416/UID 0).
Dec 01 09:57:18 compute-2 python3.9[112415]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 09:57:18 compute-2 sudo[112413]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:57:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:18.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:57:18 compute-2 ceph-mon[76053]: pgmap v181: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 01 09:57:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:18 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:18 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:19 compute-2 sudo[112570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmwhcjgxkiaksqbwjdsynfpqfdgnfxlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583039.1510546-109-270908324542325/AnsiballZ_command.py'
Dec 01 09:57:19 compute-2 sudo[112570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:19 compute-2 python3.9[112572]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:57:19 compute-2 sudo[112570]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:19 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:19 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:20 compute-2 systemd-coredump[112417]: Process 102402 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 58:
                                                    #0  0x00007f06bea0432e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Dec 01 09:57:20 compute-2 systemd[1]: systemd-coredump@2-112416-0.service: Deactivated successfully.
Dec 01 09:57:20 compute-2 systemd[1]: systemd-coredump@2-112416-0.service: Consumed 1.508s CPU time.
Dec 01 09:57:20 compute-2 podman[112654]: 2025-12-01 09:57:20.212019625 +0000 UTC m=+0.028551907 container died 50511f6d12e313e42165f1146f0942754f55ea8d4d17a2b69412f0ed4b77e228 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325)
Dec 01 09:57:20 compute-2 systemd[1]: var-lib-containers-storage-overlay-a39ffe6e3d97de20c2d89b16326a757bb2942271af9162549b11af7a345dd6ac-merged.mount: Deactivated successfully.
Dec 01 09:57:20 compute-2 podman[112654]: 2025-12-01 09:57:20.256783178 +0000 UTC m=+0.073315430 container remove 50511f6d12e313e42165f1146f0942754f55ea8d4d17a2b69412f0ed4b77e228 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec 01 09:57:20 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec 01 09:57:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:20.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:20 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec 01 09:57:20 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.885s CPU time.
Dec 01 09:57:20 compute-2 sudo[112769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftyeidklgbxhuiowupciblijsfnsjdtn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583040.103664-133-238308560613653/AnsiballZ_stat.py'
Dec 01 09:57:20 compute-2 sudo[112769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:20 compute-2 python3.9[112771]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:57:20 compute-2 sudo[112769]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:20.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:20 compute-2 ceph-mon[76053]: pgmap v182: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:57:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:20 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:20 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:21 compute-2 sudo[112923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcdttceusfddhvkrtanxnbvakcskyunf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583041.2029014-160-103689438300892/AnsiballZ_file.py'
Dec 01 09:57:21 compute-2 sudo[112923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:21 compute-2 python3.9[112925]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:57:21 compute-2 sudo[112923]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:21 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:21 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:22.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:22 compute-2 sshd-session[111928]: Connection closed by 192.168.122.30 port 40420
Dec 01 09:57:22 compute-2 sshd-session[111924]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:57:22 compute-2 systemd[1]: session-45.scope: Deactivated successfully.
Dec 01 09:57:22 compute-2 systemd[1]: session-45.scope: Consumed 4.021s CPU time.
Dec 01 09:57:22 compute-2 systemd-logind[795]: Session 45 logged out. Waiting for processes to exit.
Dec 01 09:57:22 compute-2 systemd-logind[795]: Removed session 45.
Dec 01 09:57:22 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:57:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 09:57:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:22.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 09:57:22 compute-2 ceph-mon[76053]: pgmap v183: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 09:57:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:22 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:22 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:23 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:23 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:24 compute-2 ceph-mon[76053]: pgmap v184: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:57:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:24.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095724 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 09:57:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:24.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:24 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:24 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 09:57:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:25 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:25 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:26 compute-2 ceph-mon[76053]: pgmap v185: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:57:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:57:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:26.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:57:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:26.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:26 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:26 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:27 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:57:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:27 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:27 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:28 compute-2 ceph-mon[76053]: pgmap v186: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 09:57:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 09:57:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:28.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 09:57:28 compute-2 sshd-session[112956]: Accepted publickey for zuul from 192.168.122.30 port 53710 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 09:57:28 compute-2 systemd-logind[795]: New session 46 of user zuul.
Dec 01 09:57:28 compute-2 systemd[1]: Started Session 46 of User zuul.
Dec 01 09:57:28 compute-2 sshd-session[112956]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:57:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:28.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:28 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:28 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:29 compute-2 python3.9[113110]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:57:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:29 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:29 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:30 compute-2 ceph-mon[76053]: pgmap v187: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 09:57:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:30.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:30 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 3.
Dec 01 09:57:30 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 09:57:30 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.885s CPU time.
Dec 01 09:57:30 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec 01 09:57:30 compute-2 sudo[113265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxjxzqvpskmgsqhudazbkdzuquldoyoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583049.9459074-64-53541065192765/AnsiballZ_setup.py'
Dec 01 09:57:30 compute-2 sudo[113265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:30 compute-2 podman[113311]: 2025-12-01 09:57:30.679382699 +0000 UTC m=+0.046323852 container create 0a83696fd1fb4df0deb6336d1f754183d570a239a660337ec3c81c0fb790707d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Dec 01 09:57:30 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8c26d81d0005795c03bdf5f7ceb38947a7791a4126e9a5303db7a23f219f7cf/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 01 09:57:30 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8c26d81d0005795c03bdf5f7ceb38947a7791a4126e9a5303db7a23f219f7cf/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:57:30 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8c26d81d0005795c03bdf5f7ceb38947a7791a4126e9a5303db7a23f219f7cf/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:57:30 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8c26d81d0005795c03bdf5f7ceb38947a7791a4126e9a5303db7a23f219f7cf/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:57:30 compute-2 podman[113311]: 2025-12-01 09:57:30.751481578 +0000 UTC m=+0.118422781 container init 0a83696fd1fb4df0deb6336d1f754183d570a239a660337ec3c81c0fb790707d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 09:57:30 compute-2 podman[113311]: 2025-12-01 09:57:30.658930129 +0000 UTC m=+0.025871302 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:57:30 compute-2 podman[113311]: 2025-12-01 09:57:30.756631944 +0000 UTC m=+0.123573107 container start 0a83696fd1fb4df0deb6336d1f754183d570a239a660337ec3c81c0fb790707d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 01 09:57:30 compute-2 bash[113311]: 0a83696fd1fb4df0deb6336d1f754183d570a239a660337ec3c81c0fb790707d
Dec 01 09:57:30 compute-2 python3.9[113268]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:57:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:30 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 01 09:57:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:30 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 01 09:57:30 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 09:57:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:30 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 01 09:57:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:30 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 01 09:57:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:30 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 01 09:57:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:30 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 01 09:57:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:30 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 01 09:57:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 09:57:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:30.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 09:57:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:30 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 09:57:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:30 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:30 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:31 compute-2 sudo[113265]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:31 compute-2 sudo[113452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drxogycnwubwltepbqisshllafeqxkdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583049.9459074-64-53541065192765/AnsiballZ_dnf.py'
Dec 01 09:57:31 compute-2 sudo[113452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:31 compute-2 python3.9[113454]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 01 09:57:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:31 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:31 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:32.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:32 compute-2 ceph-mon[76053]: pgmap v188: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:57:32 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:57:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:32.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:32 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:32 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:32 compute-2 sudo[113452]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:33 compute-2 python3.9[113607]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:57:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:33 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:33 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:34.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:34 compute-2 ceph-mon[76053]: pgmap v189: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 09:57:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:34.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:34 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:34 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:35 compute-2 python3.9[113759]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 01 09:57:35 compute-2 python3.9[113910]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:57:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:35 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:35 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 09:57:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:36.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 09:57:36 compute-2 ceph-mon[76053]: pgmap v190: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 09:57:36 compute-2 python3.9[114060]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:57:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:36.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:36 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 09:57:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:36 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 09:57:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:36 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:36 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:37 compute-2 sshd-session[112959]: Connection closed by 192.168.122.30 port 53710
Dec 01 09:57:37 compute-2 sshd-session[112956]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:57:37 compute-2 systemd[1]: session-46.scope: Deactivated successfully.
Dec 01 09:57:37 compute-2 systemd[1]: session-46.scope: Consumed 5.825s CPU time.
Dec 01 09:57:37 compute-2 systemd-logind[795]: Session 46 logged out. Waiting for processes to exit.
Dec 01 09:57:37 compute-2 systemd-logind[795]: Removed session 46.
Dec 01 09:57:37 compute-2 sudo[114087]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:57:37 compute-2 sudo[114087]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:57:37 compute-2 sudo[114087]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:37 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:57:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:37 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:37 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:57:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:38.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:57:38 compute-2 ceph-mon[76053]: pgmap v191: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Dec 01 09:57:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:57:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:38.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:57:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:38 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:38 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:39 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:39 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:40.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:40 compute-2 ceph-mon[76053]: pgmap v192: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Dec 01 09:57:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 09:57:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:40.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:40 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:40 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:41 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:41 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:42.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:42 compute-2 ceph-mon[76053]: pgmap v193: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 09:57:42 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:57:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:57:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:42.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:57:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:42 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:42 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd078000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:43 compute-2 sshd-session[114133]: Accepted publickey for zuul from 192.168.122.30 port 39998 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 09:57:43 compute-2 systemd-logind[795]: New session 47 of user zuul.
Dec 01 09:57:43 compute-2 systemd[1]: Started Session 47 of User zuul.
Dec 01 09:57:43 compute-2 sshd-session[114133]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:43 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd070001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:43 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:57:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:44.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:57:44 compute-2 ceph-mon[76053]: pgmap v194: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 09:57:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:44 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd078000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:44 compute-2 python3.9[114286]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:57:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:44.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:44 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:44 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:45 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd054000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:45 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:45 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:45 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:46 compute-2 sudo[114442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krbimfpnpufenphczftsojycbxkzqzod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583065.6837049-113-72242327331894/AnsiballZ_file.py'
Dec 01 09:57:46 compute-2 sudo[114442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:46 compute-2 python3.9[114444]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:57:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:46.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:46 compute-2 sudo[114442]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:46 compute-2 ceph-mon[76053]: pgmap v195: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 09:57:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095746 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 09:57:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:46 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:46 compute-2 sudo[114594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruqrxvgpobqxepaubbooyipyofhgqgwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583066.5215006-113-228506830227188/AnsiballZ_file.py'
Dec 01 09:57:46 compute-2 sudo[114594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:46.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:46 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:46 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:47 compute-2 python3.9[114596]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:57:47 compute-2 sudo[114594]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:47 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0780021f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:47 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:57:47 compute-2 sudo[114748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdejnguvbkmkqkuhqxjchcczuckjedkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583067.2458491-154-127686805958601/AnsiballZ_stat.py'
Dec 01 09:57:47 compute-2 sudo[114748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:47 compute-2 python3.9[114750]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:57:47 compute-2 sudo[114748]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:47 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:47 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0540016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:47 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:48.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:48 compute-2 sudo[114871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcnzriiwkcrwwzqekjaiwyrecrfljtik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583067.2458491-154-127686805958601/AnsiballZ_copy.py'
Dec 01 09:57:48 compute-2 sudo[114871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:48 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:48 compute-2 ceph-mon[76053]: pgmap v196: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 09:57:48 compute-2 python3.9[114873]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583067.2458491-154-127686805958601/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=20685843cc777b97f3f9ed43b7fa90867261a4e4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:57:48 compute-2 sudo[114871]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:48.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:48 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:48 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:49 compute-2 sudo[115024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsiwcqqugxclbaffexxkkudztydvrjqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583068.7933075-154-70010991880615/AnsiballZ_stat.py'
Dec 01 09:57:49 compute-2 sudo[115024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:49 compute-2 python3.9[115026]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:57:49 compute-2 sudo[115024]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:49 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:49 compute-2 sudo[115148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcrzapkunumrrjvnyofvyxxoadtpurue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583068.7933075-154-70010991880615/AnsiballZ_copy.py'
Dec 01 09:57:49 compute-2 sudo[115148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:49 compute-2 python3.9[115150]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583068.7933075-154-70010991880615/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=cb547b0bb0278866a992ba3ec36d52c9fc332990 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:57:49 compute-2 sudo[115148]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:49 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:49 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0780021f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:49 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:50 compute-2 sudo[115300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkfebvqscwyoladlivtldfrlinvcwcyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583069.9111946-154-273807775790573/AnsiballZ_stat.py'
Dec 01 09:57:50 compute-2 sudo[115300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:50 compute-2 python3.9[115302]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:57:50 compute-2 sudo[115300]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:57:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:50.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:57:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:50 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0540016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:50 compute-2 sudo[115423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jaekvbcijcjbhkebzmtvfbpnkhfowlgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583069.9111946-154-273807775790573/AnsiballZ_copy.py'
Dec 01 09:57:50 compute-2 sudo[115423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:50 compute-2 ceph-mon[76053]: pgmap v197: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Dec 01 09:57:50 compute-2 python3.9[115425]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583069.9111946-154-273807775790573/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=4ba8c031c9af75c52f7e52cce117cb7e27d6734c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:57:50 compute-2 sudo[115423]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:50.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:50 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:50 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:50 compute-2 sudo[115451]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:57:50 compute-2 sudo[115451]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:57:50 compute-2 sudo[115451]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:51 compute-2 sudo[115476]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 09:57:51 compute-2 sudo[115476]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:57:51 compute-2 sudo[115641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwdrchnmajfzqhkhquozruzplkfrwnfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583071.0447583-281-61817772590923/AnsiballZ_file.py'
Dec 01 09:57:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:51 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c002910 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:51 compute-2 sudo[115641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:51 compute-2 python3.9[115643]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:57:51 compute-2 sudo[115476]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:51 compute-2 sudo[115641]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:51 compute-2 sudo[115810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwsrusjtnhtfcbzqxnwdtqpcjnslivsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583071.6649702-281-71072290165046/AnsiballZ_file.py'
Dec 01 09:57:51 compute-2 sudo[115810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:51 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:51 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:51 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:52 compute-2 python3.9[115812]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:57:52 compute-2 sudo[115810]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:52.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:52 compute-2 ceph-mon[76053]: pgmap v198: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 426 B/s wr, 2 op/s
Dec 01 09:57:52 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:57:52 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:57:52 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:57:52 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:57:52 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:57:52 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:57:52 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:57:52 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:57:52 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:57:52 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:57:52 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:57:52 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:57:52 compute-2 sudo[115962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fezzoweslovfemdiyhxokvpuvgjxgood ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583072.2810886-325-150437733743322/AnsiballZ_stat.py'
Dec 01 09:57:52 compute-2 sudo[115962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:52 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0780021f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:52 compute-2 python3.9[115964]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:57:52 compute-2 sudo[115962]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:57:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:52.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:57:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:52 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:52 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:53 compute-2 sudo[116086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvluszbfjsidvlkmbcsrhgihgjoyebao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583072.2810886-325-150437733743322/AnsiballZ_copy.py'
Dec 01 09:57:53 compute-2 sudo[116086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:53 compute-2 python3.9[116088]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583072.2810886-325-150437733743322/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=8989868de8890869ff35fea6d52127b9ad32b210 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:57:53 compute-2 sudo[116086]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:53 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0540016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:53 compute-2 sudo[116239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgacxbcbymwdlngsodwdajyphnoyalcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583073.396756-325-6695473273333/AnsiballZ_stat.py'
Dec 01 09:57:53 compute-2 sudo[116239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:53 compute-2 ceph-mon[76053]: pgmap v199: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 187 B/s rd, 0 B/s wr, 0 op/s
Dec 01 09:57:53 compute-2 python3.9[116241]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:57:53 compute-2 sudo[116239]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:53 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:53 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c002910 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:53 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095754 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 09:57:54 compute-2 sudo[116362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-potfnwmsmyeputnwgzuqhyiqcrxuuoez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583073.396756-325-6695473273333/AnsiballZ_copy.py'
Dec 01 09:57:54 compute-2 sudo[116362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 09:57:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:54.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 09:57:54 compute-2 python3.9[116364]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583073.396756-325-6695473273333/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=35a392a510e9baafc6c00afe5c05a05ddead468b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:57:54 compute-2 sudo[116362]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:54 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:54 compute-2 sudo[116514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eujiuprwbukpsixouaprkmfzjtlqynex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583074.5305223-325-136540502225084/AnsiballZ_stat.py'
Dec 01 09:57:54 compute-2 sudo[116514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:54 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 09:57:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:54.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:54 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:54 compute-2 python3.9[116516]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:57:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:54 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:54 compute-2 sudo[116514]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:55 compute-2 sudo[116639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-haxtaezqwinxuvyjcyqfxclkeanaxako ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583074.5305223-325-136540502225084/AnsiballZ_copy.py'
Dec 01 09:57:55 compute-2 sudo[116639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:55 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0780095a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:55 compute-2 python3.9[116641]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583074.5305223-325-136540502225084/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=b97d4b4885978a0a0d2e5e3d6c6e7b50f92a0272 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:57:55 compute-2 sudo[116639]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:55 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:55 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd054002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:55 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:55 compute-2 ceph-mon[76053]: pgmap v200: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 187 B/s rd, 0 B/s wr, 0 op/s
Dec 01 09:57:56 compute-2 sudo[116791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwlzcxajhlhzcsdsxiewzcxjpgsbpxqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583075.736276-451-77153859471725/AnsiballZ_file.py'
Dec 01 09:57:56 compute-2 sudo[116791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:56 compute-2 python3.9[116793]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:57:56 compute-2 sudo[116791]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:57:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:56.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:57:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:56 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c002910 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:56 compute-2 sudo[116943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vendxdfllvkxfujxsagoukqszijfsqkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583076.3574154-451-24863224109056/AnsiballZ_file.py'
Dec 01 09:57:56 compute-2 sudo[116943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:56 compute-2 python3.9[116945]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:57:56 compute-2 sudo[116943]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:57:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:56.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:57:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:56 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:56 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:57 compute-2 ceph-mon[76053]: pgmap v201: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 281 B/s rd, 0 B/s wr, 0 op/s
Dec 01 09:57:57 compute-2 sudo[117096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzxuoosmpmodigdhuovpanwapjanidtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583077.0082722-494-221483209263636/AnsiballZ_stat.py'
Dec 01 09:57:57 compute-2 sudo[117096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:57 compute-2 sudo[117099]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:57:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:57 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c002910 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:57 compute-2 sudo[117099]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:57:57 compute-2 sudo[117099]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:57 compute-2 python3.9[117100]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:57:57 compute-2 sudo[117096]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:57 compute-2 sudo[117125]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:57:57 compute-2 sudo[117125]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:57:57 compute-2 sudo[117125]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:57 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:57:57 compute-2 sudo[117270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esikftikxdpmzkluuyydvxwrubczgdxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583077.0082722-494-221483209263636/AnsiballZ_copy.py'
Dec 01 09:57:57 compute-2 sudo[117270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:57 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:57 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:57 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:57 compute-2 python3.9[117272]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583077.0082722-494-221483209263636/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=25d812174fecba3995f204562b7eb9454b9bc312 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:57:58 compute-2 sudo[117270]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:58 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:57:58 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:57:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 09:57:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:58.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 09:57:58 compute-2 sudo[117422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vazcbpbzfzodxlractpcxvfnvdfvcpfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583078.125759-494-229030044106469/AnsiballZ_stat.py'
Dec 01 09:57:58 compute-2 sudo[117422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:58 compute-2 python3.9[117424]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:57:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:58 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:58 compute-2 sudo[117422]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:57:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 09:57:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:58.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 09:57:58 compute-2 sudo[117546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-solysivdimeyqfpskagyahvxbsioptdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583078.125759-494-229030044106469/AnsiballZ_copy.py'
Dec 01 09:57:58 compute-2 sudo[117546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:58 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:58 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:59 compute-2 python3.9[117548]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583078.125759-494-229030044106469/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=35a392a510e9baafc6c00afe5c05a05ddead468b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:57:59 compute-2 ceph-mon[76053]: pgmap v202: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 187 B/s rd, 0 op/s
Dec 01 09:57:59 compute-2 sudo[117546]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:59 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd054002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:59 compute-2 sudo[117699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnluovexorehafzgwikdjlavqiupmfnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583079.2615595-494-235434918205183/AnsiballZ_stat.py'
Dec 01 09:57:59 compute-2 sudo[117699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:57:59 compute-2 python3.9[117701]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:57:59 compute-2 sudo[117699]: pam_unix(sudo:session): session closed for user root
Dec 01 09:57:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:59 2025: (VI_0) received an invalid passwd!
Dec 01 09:57:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:59 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c004190 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:57:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:59 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:00 compute-2 sudo[117822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztbllzughzpxqursqikupnzjlvizxzqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583079.2615595-494-235434918205183/AnsiballZ_copy.py'
Dec 01 09:58:00 compute-2 sudo[117822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:00 compute-2 python3.9[117824]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583079.2615595-494-235434918205183/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=778920b7a052bb42e67e37298fd1367348ea2518 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:58:00 compute-2 sudo[117822]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:58:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:00.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:58:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:00 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:58:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:00.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:58:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:00 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:00 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:01 compute-2 ceph-mon[76053]: pgmap v203: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 187 B/s rd, 0 op/s
Dec 01 09:58:01 compute-2 sudo[117976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqsgpxwvurousmntkrobfoqptypfldfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583081.0324862-646-107508753633579/AnsiballZ_file.py'
Dec 01 09:58:01 compute-2 sudo[117976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:01 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:01 compute-2 python3.9[117978]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:58:01 compute-2 sudo[117976]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:01 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:01 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd054003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:01 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:01 compute-2 sudo[118128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlhloxqrbxuvntimtxdxdzhvcwbxjwnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583081.6949613-679-258138505942696/AnsiballZ_stat.py'
Dec 01 09:58:01 compute-2 sudo[118128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:02 compute-2 python3.9[118130]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:58:02 compute-2 sudo[118128]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:02 : epoch 692d668a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 09:58:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:58:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:02.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:58:02 compute-2 sudo[118251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdhvdhlayepofxslcfvrviqyzhumnujz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583081.6949613-679-258138505942696/AnsiballZ_copy.py'
Dec 01 09:58:02 compute-2 sudo[118251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:02 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:58:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:02 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c004190 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:02 compute-2 python3.9[118253]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583081.6949613-679-258138505942696/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8c8748787c49c5bdccd5df153e138fac81f5459e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:58:02 compute-2 sudo[118251]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:58:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:02.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:58:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:02 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:02 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:03 compute-2 sudo[118404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbbpicesigvfluxrprmoinlwzpwzmjzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583082.9136004-735-125827042316235/AnsiballZ_file.py'
Dec 01 09:58:03 compute-2 sudo[118404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:03 compute-2 ceph-mon[76053]: pgmap v204: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 563 B/s rd, 93 B/s wr, 0 op/s
Dec 01 09:58:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:03 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:03 compute-2 python3.9[118406]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:58:03 compute-2 sudo[118404]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:03 compute-2 sudo[118557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgdxuauuqparilcsfuxpfwirypvwwpyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583083.560652-758-241209036488372/AnsiballZ_stat.py'
Dec 01 09:58:03 compute-2 sudo[118557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:03 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:03 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c0016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:03 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:04 compute-2 python3.9[118559]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:58:04 compute-2 sudo[118557]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:58:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:04.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:58:04 compute-2 sudo[118680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmpgmybwdjzrywlakwbgodqxyvsciofv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583083.560652-758-241209036488372/AnsiballZ_copy.py'
Dec 01 09:58:04 compute-2 sudo[118680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:04 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd054003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:04 compute-2 python3.9[118682]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583083.560652-758-241209036488372/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8c8748787c49c5bdccd5df153e138fac81f5459e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:58:04 compute-2 sudo[118680]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:58:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:04.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:58:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:04 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:04 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:05 compute-2 sudo[118833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfdcmqypyuulqprrigcaeeunzjrflbin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583084.8278072-806-155719395885057/AnsiballZ_file.py'
Dec 01 09:58:05 compute-2 sudo[118833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:05 compute-2 python3.9[118835]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:58:05 compute-2 ceph-mon[76053]: pgmap v205: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Dec 01 09:58:05 compute-2 sudo[118833]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:05 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c004190 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:05 : epoch 692d668a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 09:58:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:05 : epoch 692d668a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 09:58:05 compute-2 sudo[118986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sibluwihzphnscndmethquccerteonuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583085.4696105-828-133307917804415/AnsiballZ_stat.py'
Dec 01 09:58:05 compute-2 sudo[118986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:05 compute-2 python3.9[118988]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:58:05 compute-2 sudo[118986]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:05 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:05 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:05 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:06 compute-2 sudo[119109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehkwqcrlugtbhytmmzipwqyqjhcpbnxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583085.4696105-828-133307917804415/AnsiballZ_copy.py'
Dec 01 09:58:06 compute-2 sudo[119109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:58:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:06.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:58:06 compute-2 python3.9[119111]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583085.4696105-828-133307917804415/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8c8748787c49c5bdccd5df153e138fac81f5459e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:58:06 compute-2 sudo[119109]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:06 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c0016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:58:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:06.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:58:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:06 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:06 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:06 compute-2 sudo[119262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxeugflgghqqbdqzqhxlbgesfjndqids ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583086.7232282-872-35419630039845/AnsiballZ_file.py'
Dec 01 09:58:06 compute-2 sudo[119262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:07 compute-2 python3.9[119264]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:58:07 compute-2 sudo[119262]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:07 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd054003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:07 compute-2 ceph-mon[76053]: pgmap v206: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 938 B/s wr, 3 op/s
Dec 01 09:58:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:58:07 compute-2 sudo[119415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvzfdtqhdwsapjobesxpmohinfcanmbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583087.3340476-894-168257380049224/AnsiballZ_stat.py'
Dec 01 09:58:07 compute-2 sudo[119415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:07 compute-2 python3.9[119417]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:58:07 compute-2 sudo[119415]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:07 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:07 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:07 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c004190 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:08 compute-2 sudo[119538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwqkugfzzaqijfjqwfqsdmnitmfsttov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583087.3340476-894-168257380049224/AnsiballZ_copy.py'
Dec 01 09:58:08 compute-2 sudo[119538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:08 compute-2 python3.9[119540]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583087.3340476-894-168257380049224/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8c8748787c49c5bdccd5df153e138fac81f5459e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:58:08 compute-2 sudo[119538]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:08 : epoch 692d668a : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 09:58:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:58:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:08.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:58:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:08 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:08 compute-2 sudo[119690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oakfeiwjcajszwzuwibkuqkxpdgjjlam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583088.4928784-942-238064477366034/AnsiballZ_file.py'
Dec 01 09:58:08 compute-2 sudo[119690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:58:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:08.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:58:08 compute-2 python3.9[119692]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:58:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:08 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:08 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:08 compute-2 sudo[119690]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:09 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:09 compute-2 sudo[119844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utkembwhvxljvturtqsjgdexwoptqdwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583089.1015177-966-9943983322651/AnsiballZ_stat.py'
Dec 01 09:58:09 compute-2 sudo[119844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:09 compute-2 ceph-mon[76053]: pgmap v207: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 2 op/s
Dec 01 09:58:09 compute-2 python3.9[119846]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:58:09 compute-2 sudo[119844]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:09 compute-2 sudo[119967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmauvbuxdxluktaumfsnxvlssboqchig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583089.1015177-966-9943983322651/AnsiballZ_copy.py'
Dec 01 09:58:09 compute-2 sudo[119967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:09 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:09 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:09 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:10 compute-2 python3.9[119969]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583089.1015177-966-9943983322651/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8c8748787c49c5bdccd5df153e138fac81f5459e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:58:10 compute-2 sudo[119967]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 09:58:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:58:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:10.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:58:10 compute-2 sudo[120119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cknrsgohoowcoexzgisrnmffqropvyll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583090.279109-1011-252531559712036/AnsiballZ_file.py'
Dec 01 09:58:10 compute-2 sudo[120119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:10 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c004190 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:10 compute-2 python3.9[120121]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:58:10 compute-2 sudo[120119]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:58:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:10.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:58:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:10 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:10 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:11 compute-2 sudo[120272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wobbuybuudpsaectohxttkaxysusopxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583090.8797774-1033-198542904039486/AnsiballZ_stat.py'
Dec 01 09:58:11 compute-2 sudo[120272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:11 compute-2 python3.9[120274]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:58:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:11 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:11 compute-2 sudo[120272]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:11 compute-2 ceph-mon[76053]: pgmap v208: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 2 op/s
Dec 01 09:58:11 compute-2 sudo[120396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgtpbfdiewkifzsrgqpmqzvlsxkcwqzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583090.8797774-1033-198542904039486/AnsiballZ_copy.py'
Dec 01 09:58:11 compute-2 sudo[120396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:11 compute-2 python3.9[120398]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583090.8797774-1033-198542904039486/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8c8748787c49c5bdccd5df153e138fac81f5459e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:58:11 compute-2 sudo[120396]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:11 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:11 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:11 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:58:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:12.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:58:12 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:58:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:12 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:58:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:12.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:58:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:12 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:12 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:13 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c004190 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:13 compute-2 ceph-mon[76053]: pgmap v209: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 09:58:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:13 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:13 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:13 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095814 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 09:58:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:58:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:14.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:58:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:14 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:58:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:14.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:58:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:14 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:14 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:15 compute-2 sshd-session[114136]: Connection closed by 192.168.122.30 port 39998
Dec 01 09:58:15 compute-2 sshd-session[114133]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:58:15 compute-2 systemd[1]: session-47.scope: Deactivated successfully.
Dec 01 09:58:15 compute-2 systemd[1]: session-47.scope: Consumed 22.134s CPU time.
Dec 01 09:58:15 compute-2 systemd-logind[795]: Session 47 logged out. Waiting for processes to exit.
Dec 01 09:58:15 compute-2 systemd-logind[795]: Removed session 47.
Dec 01 09:58:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:15 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:15 compute-2 ceph-mon[76053]: pgmap v210: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Dec 01 09:58:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:15 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:15 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:15 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd048000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:58:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:16.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:58:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:16 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 09:58:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:16.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 09:58:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:16 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:16 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:17 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:17 compute-2 ceph-mon[76053]: pgmap v211: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Dec 01 09:58:17 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:58:17 compute-2 sudo[120430]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:58:17 compute-2 sudo[120430]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:58:17 compute-2 sudo[120430]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:17 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:17 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:17 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd054004140 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:58:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:18.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:58:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:18 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd054004140 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.773393) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583098773649, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 1249, "num_deletes": 251, "total_data_size": 3169801, "memory_usage": 3223560, "flush_reason": "Manual Compaction"}
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583098787707, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 2042220, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12051, "largest_seqno": 13295, "table_properties": {"data_size": 2036843, "index_size": 2837, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11298, "raw_average_key_size": 19, "raw_value_size": 2026011, "raw_average_value_size": 3457, "num_data_blocks": 126, "num_entries": 586, "num_filter_entries": 586, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764583000, "oldest_key_time": 1764583000, "file_creation_time": 1764583098, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 14356 microseconds, and 6000 cpu microseconds.
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.787787) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 2042220 bytes OK
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.787815) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.789664) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.789682) EVENT_LOG_v1 {"time_micros": 1764583098789677, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.789703) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 3163867, prev total WAL file size 3163867, number of live WAL files 2.
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.790789) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(1994KB)], [24(13MB)]
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583098790922, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 15897636, "oldest_snapshot_seqno": -1}
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4292 keys, 13793907 bytes, temperature: kUnknown
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583098879149, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 13793907, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13762354, "index_size": 19731, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10757, "raw_key_size": 109785, "raw_average_key_size": 25, "raw_value_size": 13680982, "raw_average_value_size": 3187, "num_data_blocks": 832, "num_entries": 4292, "num_filter_entries": 4292, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764583098, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 09:58:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:58:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:18.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.879515) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 13793907 bytes
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.915513) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 179.9 rd, 156.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 13.2 +0.0 blob) out(13.2 +0.0 blob), read-write-amplify(14.5) write-amplify(6.8) OK, records in: 4810, records dropped: 518 output_compression: NoCompression
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.915570) EVENT_LOG_v1 {"time_micros": 1764583098915550, "job": 12, "event": "compaction_finished", "compaction_time_micros": 88350, "compaction_time_cpu_micros": 30770, "output_level": 6, "num_output_files": 1, "total_output_size": 13793907, "num_input_records": 4810, "num_output_records": 4292, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583098916330, "job": 12, "event": "table_file_deletion", "file_number": 26}
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583098919881, "job": 12, "event": "table_file_deletion", "file_number": 24}
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.790476) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.920015) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.920022) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.920024) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.920025) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:58:18 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.920027) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 09:58:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:18 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:18 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:19 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:19 compute-2 ceph-mon[76053]: pgmap v212: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Dec 01 09:58:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:19 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:19 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:19 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:58:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:20.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:58:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:20 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:58:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:20.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:58:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:20 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:20 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:21 compute-2 sshd-session[120458]: Accepted publickey for zuul from 192.168.122.30 port 52600 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 09:58:21 compute-2 systemd-logind[795]: New session 48 of user zuul.
Dec 01 09:58:21 compute-2 systemd[1]: Started Session 48 of User zuul.
Dec 01 09:58:21 compute-2 sshd-session[120458]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:58:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:21 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0480016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:21 compute-2 sudo[120612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xessgsyltajjbpbkcjsfdwktkjlgwnlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583101.3542907-29-161690671841183/AnsiballZ_file.py'
Dec 01 09:58:21 compute-2 sudo[120612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:21 compute-2 ceph-mon[76053]: pgmap v213: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Dec 01 09:58:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:21 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:21 compute-2 python3.9[120614]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:58:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:21 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:21 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:21 compute-2 sudo[120612]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:58:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:22.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:58:22 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:58:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:22 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:22 compute-2 sudo[120764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apzsoivlghncxnhvpnphspqzyigdyepc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583102.253558-65-50319734965389/AnsiballZ_stat.py'
Dec 01 09:58:22 compute-2 sudo[120764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:22 compute-2 python3.9[120766]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:58:22 compute-2 sudo[120764]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:58:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:22.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:58:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:22 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:22 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:23 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd054004140 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:23 compute-2 sudo[120889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uytaadyzsbvmtjdubipiicrmznbpfafa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583102.253558-65-50319734965389/AnsiballZ_copy.py'
Dec 01 09:58:23 compute-2 sudo[120889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:23 compute-2 python3.9[120891]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583102.253558-65-50319734965389/.source.conf _original_basename=ceph.conf follow=False checksum=0a8180f0f80a13ef358ded9b1ade2f059a9b256f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:58:23 compute-2 sudo[120889]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:23 compute-2 ceph-mon[76053]: pgmap v214: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Dec 01 09:58:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:23 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:23 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:23 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd048001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:24 compute-2 sudo[121041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngfsfajxiopwigyrigrsxcydkwrzjnfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583103.7355587-65-236270392236463/AnsiballZ_stat.py'
Dec 01 09:58:24 compute-2 sudo[121041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:24 compute-2 python3.9[121043]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:58:24 compute-2 sudo[121041]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:58:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:24.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:58:24 compute-2 sudo[121164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vngctofwvmirreeikntyntiicmffpfyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583103.7355587-65-236270392236463/AnsiballZ_copy.py'
Dec 01 09:58:24 compute-2 sudo[121164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:24 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd048001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:24 compute-2 python3.9[121166]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583103.7355587-65-236270392236463/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=5a16a5bd4a7ebcbad903a4d80924389de6535d80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:58:24 compute-2 sudo[121164]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:24 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 09:58:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:58:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:24.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:58:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:24 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:24 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:25 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:25 compute-2 sshd-session[120462]: Connection closed by 192.168.122.30 port 52600
Dec 01 09:58:25 compute-2 sshd-session[120458]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:58:25 compute-2 systemd[1]: session-48.scope: Deactivated successfully.
Dec 01 09:58:25 compute-2 systemd[1]: session-48.scope: Consumed 2.719s CPU time.
Dec 01 09:58:25 compute-2 systemd-logind[795]: Session 48 logged out. Waiting for processes to exit.
Dec 01 09:58:25 compute-2 systemd-logind[795]: Removed session 48.
Dec 01 09:58:25 compute-2 ceph-mon[76053]: pgmap v215: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Dec 01 09:58:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:25 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:25 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:25 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd054004140 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:58:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:26.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:58:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:26 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd054004140 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:58:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:26.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:58:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:26 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:26 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:27 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:27 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:58:27 compute-2 ceph-mon[76053]: pgmap v216: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Dec 01 09:58:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:27 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:27 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:27 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:58:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:28.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:58:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:28 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd048001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:58:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:28.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:58:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:28 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:28 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:29 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd078001d70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:29 compute-2 ceph-mon[76053]: pgmap v217: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:58:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:29 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:29 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:29 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:58:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:30.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:58:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:30 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:30 compute-2 sshd-session[121199]: Accepted publickey for zuul from 192.168.122.30 port 41948 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 09:58:30 compute-2 systemd-logind[795]: New session 49 of user zuul.
Dec 01 09:58:30 compute-2 systemd[1]: Started Session 49 of User zuul.
Dec 01 09:58:30 compute-2 sshd-session[121199]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:58:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:58:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:30.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:58:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:30 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:30 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:31 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd048001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:31 compute-2 python3.9[121354]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:58:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:31 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:31 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:31 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:32 compute-2 ceph-mon[76053]: pgmap v218: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:58:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:58:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:32.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:58:32 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:58:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:32 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:58:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:32.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:58:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:32 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:32 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:32 compute-2 sudo[121509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrthncmgntenycnmqdculplkmxynazmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583112.5740793-64-199256607873027/AnsiballZ_file.py'
Dec 01 09:58:33 compute-2 sudo[121509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:33 compute-2 ceph-mon[76053]: pgmap v219: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 01 09:58:33 compute-2 python3.9[121511]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:58:33 compute-2 sudo[121509]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:33 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:33 compute-2 sudo[121662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcasodbtfrihctfvzxxtuaozypcibuxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583113.3487818-64-120335691580399/AnsiballZ_file.py'
Dec 01 09:58:33 compute-2 sudo[121662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:33 compute-2 python3.9[121664]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:58:33 compute-2 sudo[121662]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:33 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:33 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:33 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0480036e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:58:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:34.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:58:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:34 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:34 compute-2 python3.9[121814]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:58:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:58:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:34.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:58:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:34 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:34 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:35 compute-2 ceph-mon[76053]: pgmap v220: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:58:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:35 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:35 compute-2 sudo[121966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqrtwcpiycvcaymznftbaunxoscqpjuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583115.0606954-133-162001177195773/AnsiballZ_seboolean.py'
Dec 01 09:58:35 compute-2 sudo[121966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:35 compute-2 python3.9[121968]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 01 09:58:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:35 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:35 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:35 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:58:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:36.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:58:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:36 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0480036e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:36 compute-2 sudo[121966]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:58:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:36.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:58:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:36 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:36 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:37 compute-2 ceph-mon[76053]: pgmap v221: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 09:58:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:37 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:37 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:58:37 compute-2 sudo[122124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wziurtwixxccurycbceupvwhvlllaczm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583117.3714778-163-203107852326027/AnsiballZ_setup.py'
Dec 01 09:58:37 compute-2 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Dec 01 09:58:37 compute-2 sudo[122124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:37 compute-2 sudo[122126]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:58:37 compute-2 sudo[122126]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:58:37 compute-2 sudo[122126]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:37 compute-2 python3.9[122127]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 09:58:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:37 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:37 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:37 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0780012d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:38 compute-2 sudo[122124]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:58:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:38.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:58:38 compute-2 sudo[122233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uldooexvvaevcksidlfzlpqursksoeoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583117.3714778-163-203107852326027/AnsiballZ_dnf.py'
Dec 01 09:58:38 compute-2 sudo[122233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:38 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:38 compute-2 python3.9[122235]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 09:58:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:58:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:38.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:58:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:38 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:38 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:39 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:39 compute-2 ceph-mon[76053]: pgmap v222: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:58:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:39 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:39 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:40 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd070004290 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:40 compute-2 sudo[122233]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 09:58:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:58:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:40.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:58:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:40 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd078001470 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:58:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:40.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:58:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:40 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:40 compute-2 sudo[122389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yikmsazwlspekxuzaundxtpbljfilumo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583120.4134412-199-259953741721273/AnsiballZ_systemd.py'
Dec 01 09:58:40 compute-2 sudo[122389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:40 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:41 compute-2 python3.9[122391]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 01 09:58:41 compute-2 sudo[122389]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:41 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:41 compute-2 ceph-mon[76053]: pgmap v223: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:58:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:41 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:41 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:42 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0480036e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:42 compute-2 sudo[122545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcjfzleelwwhwxqqtgjxgigoqbyjbvwd ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764583121.6577682-224-222917157619358/AnsiballZ_edpm_nftables_snippet.py'
Dec 01 09:58:42 compute-2 sudo[122545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:42 compute-2 python3[122547]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                             rule:
                                               proto: udp
                                               dport: 4789
                                           - rule_name: 119 neutron geneve networks
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               state: ["UNTRACKED"]
                                           - rule_name: 120 neutron geneve networks no conntrack
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               table: raw
                                               chain: OUTPUT
                                               jump: NOTRACK
                                               action: append
                                               state: []
                                           - rule_name: 121 neutron geneve networks no conntrack
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               table: raw
                                               chain: PREROUTING
                                               jump: NOTRACK
                                               action: append
                                               state: []
                                            dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Dec 01 09:58:42 compute-2 sudo[122545]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 09:58:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:42.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 09:58:42 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:58:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:42 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd070004290 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:42 compute-2 sudo[122699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwjieqvazvcwgyjoznwvrbdvqdeyejjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583122.6167595-250-30525227870591/AnsiballZ_file.py'
Dec 01 09:58:42 compute-2 sudo[122699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:58:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:42.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:58:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:42 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:42 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:43 compute-2 python3.9[122701]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:58:43 compute-2 sudo[122699]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0780091b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:43 compute-2 ceph-mon[76053]: pgmap v224: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 01 09:58:43 compute-2 sudo[122852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmfqwfnjygdrxzyffmfvcsohzyyvhdxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583123.5238426-276-161315999583584/AnsiballZ_stat.py'
Dec 01 09:58:43 compute-2 sudo[122852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:43 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:43 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:44 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:44 compute-2 python3.9[122854]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:58:44 compute-2 sudo[122852]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:44 compute-2 sudo[122930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twjhxjydwqlxkeftqeentsmvmjmxaekv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583123.5238426-276-161315999583584/AnsiballZ_file.py'
Dec 01 09:58:44 compute-2 sudo[122930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:58:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:44.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:58:44 compute-2 python3.9[122932]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:58:44 compute-2 sudo[122930]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:44 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0480036e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 09:58:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:44.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 09:58:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:44 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:44 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:45 compute-2 sudo[123083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spmjunrtuqzxinhguceofwptpqujeabr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583124.8454444-310-179264436667521/AnsiballZ_stat.py'
Dec 01 09:58:45 compute-2 sudo[123083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:45 compute-2 python3.9[123085]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:58:45 compute-2 sudo[123083]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:45 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd070004290 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:45 compute-2 ceph-mon[76053]: pgmap v225: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:58:45 compute-2 sudo[123162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrwurfvctclechineyddcxizsckqxcvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583124.8454444-310-179264436667521/AnsiballZ_file.py'
Dec 01 09:58:45 compute-2 sudo[123162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:45 compute-2 python3.9[123164]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.2y7_inlh recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:58:45 compute-2 sudo[123162]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:45 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:45 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:46 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0780091b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:46 compute-2 sudo[123315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylmdjkoetzidjintbmdebmenmuggjnof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583126.0856702-347-207974433495691/AnsiballZ_stat.py'
Dec 01 09:58:46 compute-2 sudo[123315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:58:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:46.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:58:46 compute-2 python3.9[123317]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:58:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:46 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0780091b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:46 compute-2 sudo[123315]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:46 compute-2 sudo[123394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpkiaxeettuuzmmecalroralxauvsvvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583126.0856702-347-207974433495691/AnsiballZ_file.py'
Dec 01 09:58:46 compute-2 sudo[123394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:58:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:46.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:58:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:46 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:46 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:47 compute-2 python3.9[123396]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:58:47 compute-2 sudo[123394]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:47 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c002130 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:47 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:58:47 compute-2 ceph-mon[76053]: pgmap v226: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 09:58:47 compute-2 sudo[123547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ieejleaquhdmrixwstvusquiwcaftbzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583127.3792958-385-131283035167977/AnsiballZ_command.py'
Dec 01 09:58:47 compute-2 sudo[123547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:47 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:47 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:47 compute-2 python3.9[123549]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:58:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:48 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0480036e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:48 compute-2 sudo[123547]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:58:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:48.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:58:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:48 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:48 compute-2 sudo[123700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msxhyaeaeadpkookwqzhomggllazeuwl ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764583128.223979-409-109380204184500/AnsiballZ_edpm_nftables_from_files.py'
Dec 01 09:58:48 compute-2 sudo[123700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:48 compute-2 python3[123702]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 01 09:58:48 compute-2 sudo[123700]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:48 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:58:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:48.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:58:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:48 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:49 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd078009f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:49 compute-2 sudo[123854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osvjklpohavuuyziqhxyzbjlqvglvnsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583129.2123625-434-267178170546622/AnsiballZ_stat.py'
Dec 01 09:58:49 compute-2 sudo[123854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:49 compute-2 python3.9[123856]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:58:49 compute-2 ceph-mon[76053]: pgmap v227: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:58:49 compute-2 sudo[123854]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:49 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:49 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:50 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c002130 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:50 compute-2 sudo[123979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-voyiwildourdrvwussmubtvrlbamjfix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583129.2123625-434-267178170546622/AnsiballZ_copy.py'
Dec 01 09:58:50 compute-2 sudo[123979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:58:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:50.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:58:50 compute-2 python3.9[123981]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583129.2123625-434-267178170546622/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:58:50 compute-2 sudo[123979]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:50 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0480036e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:50 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:58:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:50.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:58:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:50 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:51 compute-2 sudo[124132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azipufxifhwfnrhwjqnuhkjavntuhakf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583130.797783-479-16963927719355/AnsiballZ_stat.py'
Dec 01 09:58:51 compute-2 sudo[124132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:51 compute-2 python3.9[124134]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:58:51 compute-2 sudo[124132]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:51 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:51 compute-2 sudo[124258]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwtpqcuwiqirfhabjhuckplkeefoingd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583130.797783-479-16963927719355/AnsiballZ_copy.py'
Dec 01 09:58:51 compute-2 sudo[124258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:51 compute-2 ceph-mon[76053]: pgmap v228: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:58:51 compute-2 python3.9[124260]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583130.797783-479-16963927719355/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:58:51 compute-2 sudo[124258]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:51 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:51 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:52 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd078009f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:58:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:52.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:58:52 compute-2 sudo[124410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdyjytfksyqhmjisrxyhtdiohqxzrepq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583132.244903-523-190343345560288/AnsiballZ_stat.py'
Dec 01 09:58:52 compute-2 sudo[124410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:52 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:58:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:52 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c002130 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:52 compute-2 python3.9[124412]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:58:52 compute-2 sudo[124410]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:52 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:58:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:52.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:58:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:52 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:53 compute-2 sudo[124536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdepsvpuqadpokxlwfpzsuqltliihrxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583132.244903-523-190343345560288/AnsiballZ_copy.py'
Dec 01 09:58:53 compute-2 sudo[124536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:53 compute-2 python3.9[124538]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583132.244903-523-190343345560288/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:58:53 compute-2 sudo[124536]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:53 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0480036e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:53 compute-2 sudo[124689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdtoslhsqyeeynyitpmqmsmezxakkyue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583133.6191976-569-206927435878724/AnsiballZ_stat.py'
Dec 01 09:58:53 compute-2 sudo[124689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:53 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:53 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:54 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:54 compute-2 ceph-mon[76053]: pgmap v229: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 01 09:58:54 compute-2 python3.9[124691]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:58:54 compute-2 sudo[124689]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:54 compute-2 sudo[124814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmsljzhrctzfiepgsemxqoqjdkfottkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583133.6191976-569-206927435878724/AnsiballZ_copy.py'
Dec 01 09:58:54 compute-2 sudo[124814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 09:58:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:54.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 09:58:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:54 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd078009f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:54 compute-2 python3.9[124816]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583133.6191976-569-206927435878724/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:58:54 compute-2 sudo[124814]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:54 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:58:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:54.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:58:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:54 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 09:58:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:55 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c002130 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:55 compute-2 sudo[124968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkkflenysoxxfchozmlobwvvlsbyipai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583135.1552536-614-192345320710521/AnsiballZ_stat.py'
Dec 01 09:58:55 compute-2 sudo[124968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:55 compute-2 python3.9[124970]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:58:55 compute-2 sudo[124968]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:55 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:55 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:56 compute-2 ceph-mon[76053]: pgmap v230: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:58:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:56 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0480036e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:56 compute-2 sudo[125093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzprmjlzeezuwczzazixzurdpvzxuvki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583135.1552536-614-192345320710521/AnsiballZ_copy.py'
Dec 01 09:58:56 compute-2 sudo[125093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:56 compute-2 python3.9[125095]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583135.1552536-614-192345320710521/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:58:56 compute-2 sudo[125093]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:58:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:56.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:58:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:56 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:56 compute-2 sudo[125246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljxygfmngvqaxlrphgnaxhqifswuortd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583136.6631417-658-247309896460052/AnsiballZ_file.py'
Dec 01 09:58:56 compute-2 sudo[125246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:56 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:58:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:56.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:58:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:56 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:57 compute-2 python3.9[125248]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:58:57 compute-2 ceph-mon[76053]: pgmap v231: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 09:58:57 compute-2 sudo[125246]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:57 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd078009f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:57 compute-2 sudo[125250]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:58:57 compute-2 sudo[125250]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:58:57 compute-2 sudo[125250]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:57 compute-2 sudo[125296]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Dec 01 09:58:57 compute-2 sudo[125296]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:58:57 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:58:57 compute-2 sudo[125412]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:58:57 compute-2 sudo[125412]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:58:57 compute-2 sudo[125412]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:57 compute-2 sudo[125296]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:57 compute-2 sudo[125494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqkqzohhjmecrbndcglflxjktxjxbquk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583137.580873-682-106012510462121/AnsiballZ_command.py'
Dec 01 09:58:57 compute-2 sudo[125494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:57 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:57 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:58 compute-2 python3.9[125496]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:58:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:58 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c002130 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:58 compute-2 sudo[125494]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:58 compute-2 sudo[125524]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 09:58:58 compute-2 sudo[125524]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:58:58 compute-2 sudo[125524]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:58 compute-2 sudo[125549]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 09:58:58 compute-2 sudo[125549]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:58:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:58:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:58.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:58:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:58 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0480036e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:58 compute-2 sudo[125719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-effssthzcxibgbwkdulospgscaxnhtyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583138.3303185-707-123617737740881/AnsiballZ_blockinfile.py'
Dec 01 09:58:58 compute-2 sudo[125719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:58 compute-2 sudo[125549]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:58 compute-2 python3.9[125728]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:58:58 compute-2 sudo[125719]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:58 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:58:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 09:58:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:58.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 09:58:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:58 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:58 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:58:58 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:58:58 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:58:58 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 09:58:58 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:58:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:59 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0480036e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:58:59 compute-2 sudo[125886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsdmiksxycdykdvilaaswipghlqnotry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583139.2866342-734-42382783158991/AnsiballZ_command.py'
Dec 01 09:58:59 compute-2 sudo[125886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:58:59 compute-2 python3.9[125888]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:58:59 compute-2 sudo[125886]: pam_unix(sudo:session): session closed for user root
Dec 01 09:58:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:59 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:59 2025: (VI_0) received an invalid passwd!
Dec 01 09:58:59 compute-2 ceph-mon[76053]: pgmap v232: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:58:59 compute-2 ceph-mon[76053]: pgmap v233: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 288 B/s rd, 0 op/s
Dec 01 09:58:59 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:58:59 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 09:58:59 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 09:58:59 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 09:59:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:59:00 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd078009f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:59:00 compute-2 sudo[126039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lajowscbwtplhwnslykzbovgrbdxoxmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583139.9849293-758-17430751806223/AnsiballZ_stat.py'
Dec 01 09:59:00 compute-2 sudo[126039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:00 compute-2 python3.9[126041]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:59:00 compute-2 sudo[126039]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:00.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:59:00 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd054001090 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:59:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:00 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:00.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:00 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:01 compute-2 sudo[126194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njjfvnjtrqxfieogugnsvrrbxsrnecjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583140.7832966-782-261213954518434/AnsiballZ_command.py'
Dec 01 09:59:01 compute-2 sudo[126194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:01 compute-2 ceph-mon[76053]: pgmap v234: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 480 B/s rd, 0 op/s
Dec 01 09:59:01 compute-2 python3.9[126196]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:59:01 compute-2 sudo[126194]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:59:01 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd040000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:59:01 compute-2 sudo[126350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmzgjfakcbfhwazzuhzrcvewxkzxuesy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583141.4769468-805-12919974588009/AnsiballZ_file.py'
Dec 01 09:59:01 compute-2 sudo[126350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:01 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:01 compute-2 python3.9[126352]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:59:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:01 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:01 compute-2 sudo[126350]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:59:02 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:59:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:02.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:02 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:59:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:59:02 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd078009f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:59:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:02 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:02.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:02 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:03 compute-2 python3.9[126503]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:59:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:59:03 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd054001090 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:59:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:03 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:03 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:03 compute-2 ceph-mon[76053]: pgmap v235: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 288 B/s rd, 0 op/s
Dec 01 09:59:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:59:04 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0400016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:59:04 compute-2 sudo[126655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfnqzvtigvuctapxakjjtwozhyjumhvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583144.058957-925-173187462779323/AnsiballZ_command.py'
Dec 01 09:59:04 compute-2 sudo[126655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:04 compute-2 sudo[126658]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 09:59:04 compute-2 sudo[126658]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:59:04 compute-2 sudo[126658]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 09:59:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:04.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 09:59:04 compute-2 python3.9[126657]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:c6:22:5a:f7" external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:59:04 compute-2 ovs-vsctl[126683]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:c6:22:5a:f7 external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Dec 01 09:59:04 compute-2 sudo[126655]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:59:04 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd078009f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:59:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:04 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 09:59:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:04.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 09:59:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:04 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:05 compute-2 sudo[126834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zeqychpfrlvolcisgwueuerqmqmtszbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583144.9024243-953-222321717038121/AnsiballZ_command.py'
Dec 01 09:59:05 compute-2 sudo[126834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:05 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:59:05 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 09:59:05 compute-2 ceph-mon[76053]: pgmap v236: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 288 B/s rd, 0 op/s
Dec 01 09:59:05 compute-2 python3.9[126836]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ovs-vsctl show | grep -q "Manager"
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:59:05 compute-2 sudo[126834]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:59:05 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd078009f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:59:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:05 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:05 compute-2 sudo[126990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwqfyygfhaxtsiukeouiwipufxsrodoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583145.699538-976-195567489674709/AnsiballZ_command.py'
Dec 01 09:59:05 compute-2 sudo[126990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:05 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:59:06 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd054001230 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:59:06 compute-2 python3.9[126992]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:59:06 compute-2 ovs-vsctl[126993]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Dec 01 09:59:06 compute-2 sudo[126990]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:59:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:06.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:59:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:59:06 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0400016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:59:06 compute-2 python3.9[127145]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:59:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:06 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:59:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:06.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:59:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:06 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:59:07 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0400016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:59:07 compute-2 sudo[127299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qovbskqmjpyhpgewlglyaxuzpmtjeebt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583147.2312653-1027-239495481011080/AnsiballZ_file.py'
Dec 01 09:59:07 compute-2 sudo[127299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:59:07 compute-2 python3.9[127301]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:59:07 compute-2 sudo[127299]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:07 compute-2 ceph-mon[76053]: pgmap v237: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 288 B/s rd, 0 op/s
Dec 01 09:59:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:07 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:07 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:59:08 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0400016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:59:08 compute-2 sudo[127451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzhsksspwaqdgbzebidpsafjdektjppp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583148.1089232-1051-237655042787906/AnsiballZ_stat.py'
Dec 01 09:59:08 compute-2 sudo[127451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:08.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:08 compute-2 python3.9[127453]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:59:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:59:08 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd054001230 fd 47 proxy ignored for local
Dec 01 09:59:08 compute-2 kernel: ganesha.nfsd[125830]: segfault at 50 ip 00007fd122e8d32e sp 00007fd0ed7f9210 error 4 in libntirpc.so.5.8[7fd122e72000+2c000] likely on CPU 2 (core 0, socket 2)
Dec 01 09:59:08 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec 01 09:59:08 compute-2 sudo[127451]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:08 compute-2 systemd[1]: Started Process Core Dump (PID 127455/UID 0).
Dec 01 09:59:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:08 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:08.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:08 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:09 compute-2 ceph-mon[76053]: pgmap v238: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 288 B/s rd, 0 op/s
Dec 01 09:59:09 compute-2 sudo[127533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqqnnyexqyiwbgldmbtbcsexqedfpysd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583148.1089232-1051-237655042787906/AnsiballZ_file.py'
Dec 01 09:59:09 compute-2 sudo[127533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:09 compute-2 python3.9[127535]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:59:09 compute-2 sudo[127533]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:09 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:09 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:10 compute-2 sudo[127685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdtwcqcbblpadfwefvawhthrmjzjgwqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583149.8222756-1051-16414617840122/AnsiballZ_stat.py'
Dec 01 09:59:10 compute-2 sudo[127685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:10 compute-2 python3.9[127687]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:59:10 compute-2 sudo[127685]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:10 compute-2 systemd-coredump[127457]: Process 113331 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 59:
                                                    #0  0x00007fd122e8d32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Dec 01 09:59:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 09:59:10 compute-2 systemd[1]: systemd-coredump@3-127455-0.service: Deactivated successfully.
Dec 01 09:59:10 compute-2 systemd[1]: systemd-coredump@3-127455-0.service: Consumed 1.531s CPU time.
Dec 01 09:59:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:59:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:10.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:59:10 compute-2 podman[127741]: 2025-12-01 09:59:10.542177009 +0000 UTC m=+0.041053897 container died 0a83696fd1fb4df0deb6336d1f754183d570a239a660337ec3c81c0fb790707d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True)
Dec 01 09:59:10 compute-2 systemd[1]: var-lib-containers-storage-overlay-b8c26d81d0005795c03bdf5f7ceb38947a7791a4126e9a5303db7a23f219f7cf-merged.mount: Deactivated successfully.
Dec 01 09:59:10 compute-2 sudo[127782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xecpklkndjoqgrfvaosjcblyvnizwsxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583149.8222756-1051-16414617840122/AnsiballZ_file.py'
Dec 01 09:59:10 compute-2 sudo[127782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:10 compute-2 podman[127741]: 2025-12-01 09:59:10.594859268 +0000 UTC m=+0.093736156 container remove 0a83696fd1fb4df0deb6336d1f754183d570a239a660337ec3c81c0fb790707d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS)
Dec 01 09:59:10 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec 01 09:59:10 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec 01 09:59:10 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.770s CPU time.
Dec 01 09:59:10 compute-2 python3.9[127786]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:59:10 compute-2 sudo[127782]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:10 compute-2 sshd-session[127018]: Connection closed by 45.78.219.119 port 58320 [preauth]
Dec 01 09:59:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:10 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:10.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:10 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:11 compute-2 sudo[127966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwvzldosgrveljfzczziqpdhdwmgzlex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583150.9977434-1121-134350166540945/AnsiballZ_file.py'
Dec 01 09:59:11 compute-2 sudo[127966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:11 compute-2 ceph-mon[76053]: pgmap v239: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 01 09:59:11 compute-2 python3.9[127968]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:59:11 compute-2 sudo[127966]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:11 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:11 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:12 compute-2 sudo[128118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzpkdpeapsgyactbtbvabgbtenequfmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583151.7334118-1145-63961556754649/AnsiballZ_stat.py'
Dec 01 09:59:12 compute-2 sudo[128118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:12 compute-2 python3.9[128120]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:59:12 compute-2 sudo[128118]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:59:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:12.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:59:12 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:59:12 compute-2 sudo[128196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyekyhfzyrxhcihetosvbsysrgscoyzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583151.7334118-1145-63961556754649/AnsiballZ_file.py'
Dec 01 09:59:12 compute-2 sudo[128196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:12 compute-2 python3.9[128198]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:59:12 compute-2 sudo[128196]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:12 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:12 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:12.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:13 compute-2 ceph-mon[76053]: pgmap v240: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:59:13 compute-2 sudo[128349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utxponqcpxzmuiwfxnbwyawrpummbipq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583152.9597876-1181-51476046656819/AnsiballZ_stat.py'
Dec 01 09:59:13 compute-2 sudo[128349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:13 compute-2 python3.9[128351]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:59:13 compute-2 sudo[128349]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:13 compute-2 sudo[128428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upmvzmevoomdipegxrqkgrpbiorgnymg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583152.9597876-1181-51476046656819/AnsiballZ_file.py'
Dec 01 09:59:13 compute-2 sudo[128428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:13 compute-2 python3.9[128430]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:59:13 compute-2 sudo[128428]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:13 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:13 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:14.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:14 compute-2 sudo[128580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljafsqugwqwuynpcclusdfxlocichkaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583154.1960373-1217-146605995425962/AnsiballZ_systemd.py'
Dec 01 09:59:14 compute-2 sudo[128580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095914 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 09:59:14 compute-2 python3.9[128582]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:59:14 compute-2 systemd[1]: Reloading.
Dec 01 09:59:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:14 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:14 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:14 compute-2 systemd-rc-local-generator[128609]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:59:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:59:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:14.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:59:14 compute-2 systemd-sysv-generator[128614]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:59:15 compute-2 sudo[128580]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:15 compute-2 sudo[128772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-feestrczpqnjjfqaliagxenvkvvbwgjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583155.4541821-1240-197340851568683/AnsiballZ_stat.py'
Dec 01 09:59:15 compute-2 sudo[128772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:15 compute-2 python3.9[128774]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:59:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:15 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:15 compute-2 ceph-mon[76053]: pgmap v241: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:59:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:15 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:15 compute-2 sudo[128772]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:16 compute-2 sudo[128850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucshiqvztqwbgrgomfvqadwayhqfdgqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583155.4541821-1240-197340851568683/AnsiballZ_file.py'
Dec 01 09:59:16 compute-2 sudo[128850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:16 compute-2 python3.9[128852]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:59:16 compute-2 sudo[128850]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:16.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:16 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:16 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:16.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:17 compute-2 sudo[129003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzixtfdnlmvmowrwmshyglnvbafsnrvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583156.854081-1277-37040834563950/AnsiballZ_stat.py'
Dec 01 09:59:17 compute-2 sudo[129003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:17 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:59:17 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Cumulative writes: 2246 writes, 13K keys, 2246 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.06 MB/s
                                           Cumulative WAL: 2246 writes, 2246 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 2246 writes, 13K keys, 2246 commit groups, 1.0 writes per commit group, ingest: 38.31 MB, 0.06 MB/s
                                           Interval WAL: 2246 writes, 2246 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    153.1      0.14              0.06         6    0.024       0      0       0.0       0.0
                                             L6      1/0   13.15 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.0    152.6    134.4      0.48              0.17         5    0.097     22K   2300       0.0       0.0
                                            Sum      1/0   13.15 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.0    117.7    138.7      0.63              0.23        11    0.057     22K   2300       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.0    118.2    139.2      0.63              0.23        10    0.063     22K   2300       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    152.6    134.4      0.48              0.17         5    0.097     22K   2300       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    155.6      0.14              0.06         5    0.028       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.021, interval 0.021
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.09 GB write, 0.15 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.6 seconds
                                           Interval compaction: 0.09 GB write, 0.15 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.6 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555b631689b0#2 capacity: 304.00 MB usage: 1.52 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 7.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(87,1.32 MB,0.432777%) FilterBlock(11,71.55 KB,0.0229836%) IndexBlock(11,138.77 KB,0.0445767%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Dec 01 09:59:17 compute-2 python3.9[129005]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:59:17 compute-2 sudo[129003]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:17 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:59:17 compute-2 sudo[129082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csuqlnzoaednpcsvozsqzdmwrxnuicts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583156.854081-1277-37040834563950/AnsiballZ_file.py'
Dec 01 09:59:17 compute-2 sudo[129082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:17 compute-2 python3.9[129084]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:59:17 compute-2 sudo[129082]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:17 compute-2 sudo[129097]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:59:17 compute-2 sudo[129097]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:59:17 compute-2 sudo[129097]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:17 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:17 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:17 compute-2 ceph-mon[76053]: pgmap v242: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 09:59:18 compute-2 sudo[129259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxoifxyixohzpibzeckkvgknparaomba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583158.1579726-1312-206397954975280/AnsiballZ_systemd.py'
Dec 01 09:59:18 compute-2 sudo[129259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:59:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:18.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:59:18 compute-2 python3.9[129261]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:59:18 compute-2 systemd[1]: Reloading.
Dec 01 09:59:18 compute-2 systemd-rc-local-generator[129291]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:59:18 compute-2 systemd-sysv-generator[129295]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:59:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:18 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:18 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:18.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:19 compute-2 systemd[1]: Starting Create netns directory...
Dec 01 09:59:19 compute-2 ceph-mon[76053]: pgmap v243: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:59:19 compute-2 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 01 09:59:19 compute-2 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 01 09:59:19 compute-2 systemd[1]: Finished Create netns directory.
Dec 01 09:59:19 compute-2 sudo[129259]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:19 compute-2 sudo[129456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdvplvexwgqooxnecvubfgqweimqogbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583159.6061966-1343-118986554375383/AnsiballZ_file.py'
Dec 01 09:59:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:19 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:19 compute-2 sudo[129456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:19 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:20 compute-2 python3.9[129458]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:59:20 compute-2 sudo[129456]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 09:59:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:20.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 09:59:20 compute-2 sudo[129608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owpqtkfpvfloonlkvvlbrxtikrbgtrhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583160.369572-1366-85539872515758/AnsiballZ_stat.py'
Dec 01 09:59:20 compute-2 sudo[129608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:20 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 4.
Dec 01 09:59:20 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 09:59:20 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.770s CPU time.
Dec 01 09:59:20 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec 01 09:59:20 compute-2 python3.9[129610]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:59:20 compute-2 sudo[129608]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:20 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:20 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:59:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:20.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:59:21 compute-2 podman[129706]: 2025-12-01 09:59:21.072707763 +0000 UTC m=+0.044307379 container create b4ec70173ca2212101f9b32e84d852b28cb0e7bdfd1bb1a15831731bab135103 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:59:21 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b708f6edbbf0ee76072583c1b6cae17353eaf5400c560b0722d160293568df24/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 01 09:59:21 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b708f6edbbf0ee76072583c1b6cae17353eaf5400c560b0722d160293568df24/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:59:21 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b708f6edbbf0ee76072583c1b6cae17353eaf5400c560b0722d160293568df24/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:59:21 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b708f6edbbf0ee76072583c1b6cae17353eaf5400c560b0722d160293568df24/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:59:21 compute-2 podman[129706]: 2025-12-01 09:59:21.130433698 +0000 UTC m=+0.102033334 container init b4ec70173ca2212101f9b32e84d852b28cb0e7bdfd1bb1a15831731bab135103 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0)
Dec 01 09:59:21 compute-2 podman[129706]: 2025-12-01 09:59:21.136011407 +0000 UTC m=+0.107611023 container start b4ec70173ca2212101f9b32e84d852b28cb0e7bdfd1bb1a15831731bab135103 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 09:59:21 compute-2 bash[129706]: b4ec70173ca2212101f9b32e84d852b28cb0e7bdfd1bb1a15831731bab135103
Dec 01 09:59:21 compute-2 podman[129706]: 2025-12-01 09:59:21.050675052 +0000 UTC m=+0.022274698 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:59:21 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 09:59:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:21 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 01 09:59:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:21 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 01 09:59:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:21 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 01 09:59:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:21 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 01 09:59:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:21 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 01 09:59:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:21 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 01 09:59:21 compute-2 sudo[129814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcxzkywhxruvehcocgtkwkzgjpcetgwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583160.369572-1366-85539872515758/AnsiballZ_copy.py'
Dec 01 09:59:21 compute-2 sudo[129814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:21 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 01 09:59:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:21 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 09:59:21 compute-2 python3.9[129823]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583160.369572-1366-85539872515758/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:59:21 compute-2 sudo[129814]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:21 compute-2 ceph-mon[76053]: pgmap v244: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 09:59:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:21 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:21 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:22 compute-2 sudo[129989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvrdeujegfrxdapayvuobauvcafipybs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583162.0560968-1418-63906984636976/AnsiballZ_file.py'
Dec 01 09:59:22 compute-2 sudo[129989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:22.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:22 compute-2 python3.9[129991]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:59:22 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:59:22 compute-2 sudo[129989]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:22 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:22 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:22.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:23 compute-2 sudo[130143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lltrtnfxgtykmyhmqpchihivbsubwxtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583163.0133607-1444-53497206448942/AnsiballZ_stat.py'
Dec 01 09:59:23 compute-2 sudo[130143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:23 compute-2 python3.9[130145]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 09:59:23 compute-2 sudo[130143]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:23 compute-2 sudo[130266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-affhqbshkdkasscppbehatgheazzxvfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583163.0133607-1444-53497206448942/AnsiballZ_copy.py'
Dec 01 09:59:23 compute-2 sudo[130266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:23 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:23 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:23 compute-2 ceph-mon[76053]: pgmap v245: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 09:59:24 compute-2 python3.9[130268]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583163.0133607-1444-53497206448942/.source.json _original_basename=.bboatplc follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:59:24 compute-2 sudo[130266]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:59:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:24.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:59:24 compute-2 sudo[130418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grbztsmxlokvlqhxejkshdjkvdsavcet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583164.3059878-1486-163966089111358/AnsiballZ_file.py'
Dec 01 09:59:24 compute-2 sudo[130418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:24 compute-2 python3.9[130420]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:59:24 compute-2 sudo[130418]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:24 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:24 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:24.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 09:59:25 compute-2 sudo[130572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyfffsjrqtwecqwsqnsdisdvlssrvoza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583165.5244043-1510-210503604585717/AnsiballZ_stat.py'
Dec 01 09:59:25 compute-2 sudo[130572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:25 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:25 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:26 compute-2 sudo[130572]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:26 compute-2 ceph-mon[76053]: pgmap v246: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 09:59:26 compute-2 sudo[130695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuobhgpzfecmjikktvvuqrehtnmjwolb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583165.5244043-1510-210503604585717/AnsiballZ_copy.py'
Dec 01 09:59:26 compute-2 sudo[130695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:26.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:26 compute-2 sudo[130695]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:26 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:26 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:59:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:26.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:59:27 compute-2 ceph-mon[76053]: pgmap v247: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Dec 01 09:59:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:27 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 09:59:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:27 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 09:59:27 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:59:27 compute-2 sudo[130849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itpnhgcqjqfxabqzzywtpecdlzzjnteb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583167.2478423-1561-101761584318901/AnsiballZ_container_config_data.py'
Dec 01 09:59:27 compute-2 sudo[130849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:27 compute-2 python3.9[130851]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Dec 01 09:59:27 compute-2 sudo[130849]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:27 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:27 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:28.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:28 compute-2 sudo[131001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqlatyfttwcducyrxkiepbqgvrtsyrza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583168.1838086-1588-86948009720024/AnsiballZ_container_config_hash.py'
Dec 01 09:59:28 compute-2 sudo[131001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:28 compute-2 python3.9[131003]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 01 09:59:28 compute-2 sudo[131001]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:28 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:28 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:28.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:29 compute-2 sudo[131155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-geggxyjocjykqoijxtbwqploxkyhgthk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583169.1223965-1615-278193986578425/AnsiballZ_podman_container_info.py'
Dec 01 09:59:29 compute-2 sudo[131155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:29 compute-2 python3.9[131157]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 01 09:59:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:29 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:29 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:29 compute-2 sudo[131155]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:30 compute-2 ceph-mon[76053]: pgmap v248: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Dec 01 09:59:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:30.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:30 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:30 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:30.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:31 compute-2 ceph-mon[76053]: pgmap v249: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.6 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 09:59:31 compute-2 sudo[131335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byicfnsngyzawtpjlxqgwvrovgjcqbyz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764583171.0712872-1654-152497561838679/AnsiballZ_edpm_container_manage.py'
Dec 01 09:59:31 compute-2 sudo[131335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:31 compute-2 python3[131337]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 01 09:59:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:31 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:31 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 09:59:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:32.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 09:59:32 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:59:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:32 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:32 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:32.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 09:59:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec 01 09:59:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec 01 09:59:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec 01 09:59:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec 01 09:59:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec 01 09:59:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec 01 09:59:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 09:59:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 09:59:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 09:59:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec 01 09:59:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 09:59:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec 01 09:59:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec 01 09:59:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec 01 09:59:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec 01 09:59:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec 01 09:59:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec 01 09:59:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec 01 09:59:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec 01 09:59:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec 01 09:59:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec 01 09:59:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec 01 09:59:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec 01 09:59:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 09:59:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec 01 09:59:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 09:59:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0e88000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:59:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:33 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:33 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:34 compute-2 ceph-mon[76053]: pgmap v250: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 09:59:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:34 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0e80001c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:59:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:59:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:34.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:59:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:34 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0e64000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:59:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:34 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:34 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:34.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:35 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0e7c0014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:59:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:35 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:35 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:36 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0e68000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:59:36 compute-2 ceph-mon[76053]: pgmap v251: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 09:59:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:36.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095936 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 09:59:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:36 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0e80001c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:59:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:36 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:36 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:37.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:37 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0e640016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:59:37 compute-2 ceph-mon[76053]: pgmap v252: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.6 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 09:59:37 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:59:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:37 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:37 compute-2 podman[131350]: 2025-12-01 09:59:37.961486728 +0000 UTC m=+6.042042914 image pull 3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 01 09:59:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:37 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:38 compute-2 sudo[131466]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:59:38 compute-2 sudo[131466]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:59:38 compute-2 sudo[131466]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:38 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0e7c001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:59:38 compute-2 podman[131513]: 2025-12-01 09:59:38.073466069 +0000 UTC m=+0.026198457 image pull 3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 01 09:59:38 compute-2 podman[131513]: 2025-12-01 09:59:38.253825389 +0000 UTC m=+0.206557767 container create 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:59:38 compute-2 python3[131337]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 01 09:59:38 compute-2 sudo[131335]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 09:59:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:38.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 09:59:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:38 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0e68001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:59:38 compute-2 sudo[131698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppvvyttfkcbfpzoplssdyrbqaptiaqvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583178.570706-1678-238511006549650/AnsiballZ_stat.py'
Dec 01 09:59:38 compute-2 sudo[131698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:38 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:38 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:38 compute-2 python3.9[131700]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:59:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:39.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:39 compute-2 sudo[131698]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:39 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0e80002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:59:39 compute-2 sudo[131854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjlbymmdjvwltgxwuybmyiqjdrbbssjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583179.3643374-1705-73263586608150/AnsiballZ_file.py'
Dec 01 09:59:39 compute-2 sudo[131854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:39 compute-2 python3.9[131856]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:59:39 compute-2 sudo[131854]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:39 compute-2 ceph-mon[76053]: pgmap v253: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 426 B/s wr, 2 op/s
Dec 01 09:59:39 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 09:59:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:39 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:39 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:40 compute-2 sudo[131930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mksjyqepdeyinibnesmjlxzhmrikncyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583179.3643374-1705-73263586608150/AnsiballZ_stat.py'
Dec 01 09:59:40 compute-2 sudo[131930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:40 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0e640016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:59:40 compute-2 python3.9[131932]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 09:59:40 compute-2 sudo[131930]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:40.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:40 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0e7c001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:59:40 compute-2 sudo[132081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgpzvzurfierjgfpjaaoblwjlnphqpyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583180.2852104-1705-263102799295065/AnsiballZ_copy.py'
Dec 01 09:59:40 compute-2 sudo[132081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:40 compute-2 python3.9[132083]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764583180.2852104-1705-263102799295065/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 09:59:40 compute-2 sudo[132081]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:40 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:40 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:41.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:41 compute-2 ceph-mon[76053]: pgmap v254: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 426 B/s wr, 2 op/s
Dec 01 09:59:41 compute-2 sudo[132158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuhyoljtwyoxzzxepndaohykdunafopn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583180.2852104-1705-263102799295065/AnsiballZ_systemd.py'
Dec 01 09:59:41 compute-2 sudo[132158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:41 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0e68001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:59:41 compute-2 python3.9[132160]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 01 09:59:41 compute-2 systemd[1]: Reloading.
Dec 01 09:59:41 compute-2 systemd-rc-local-generator[132192]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:59:41 compute-2 systemd-sysv-generator[132196]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:59:41 compute-2 sudo[132158]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:41 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:41 compute-2 sudo[132272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkearjliwzbgatnktwbwhiphffaoxbey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583180.2852104-1705-263102799295065/AnsiballZ_systemd.py'
Dec 01 09:59:41 compute-2 sudo[132272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:41 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:42 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0e80002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 09:59:42 compute-2 python3.9[132274]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 09:59:42 compute-2 systemd[1]: Reloading.
Dec 01 09:59:42 compute-2 systemd-sysv-generator[132306]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:59:42 compute-2 systemd-rc-local-generator[132303]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:59:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 09:59:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:42.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 09:59:42 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:59:42 compute-2 kernel: ganesha.nfsd[131399]: segfault at 50 ip 00007f0f3584332e sp 00007f0eedffa210 error 4 in libntirpc.so.5.8[7f0f35828000+2c000] likely on CPU 5 (core 0, socket 5)
Dec 01 09:59:42 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec 01 09:59:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:42 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0e640016a0 fd 38 proxy ignored for local
Dec 01 09:59:42 compute-2 systemd[1]: Started Process Core Dump (PID 132311/UID 0).
Dec 01 09:59:42 compute-2 systemd[1]: Starting ovn_controller container...
Dec 01 09:59:42 compute-2 systemd[1]: Started libcrun container.
Dec 01 09:59:42 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb10f5404b28f2cec93c40eee34dfa49d1ae8d96197b322b7c01146beb708e36/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 01 09:59:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:42 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:42 compute-2 systemd[1]: Started /usr/bin/podman healthcheck run 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9.
Dec 01 09:59:42 compute-2 podman[132316]: 2025-12-01 09:59:42.968547111 +0000 UTC m=+0.267929374 container init 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 09:59:42 compute-2 ovn_controller[132332]: + sudo -E kolla_set_configs
Dec 01 09:59:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:42 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:43 compute-2 podman[132316]: 2025-12-01 09:59:43.00332537 +0000 UTC m=+0.302707623 container start 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 01 09:59:43 compute-2 edpm-start-podman-container[132316]: ovn_controller
Dec 01 09:59:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 09:59:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:43.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 09:59:43 compute-2 systemd[1]: Created slice User Slice of UID 0.
Dec 01 09:59:43 compute-2 systemd[1]: Starting User Runtime Directory /run/user/0...
Dec 01 09:59:43 compute-2 systemd[1]: Finished User Runtime Directory /run/user/0.
Dec 01 09:59:43 compute-2 systemd[1]: Starting User Manager for UID 0...
Dec 01 09:59:43 compute-2 systemd[132361]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Dec 01 09:59:43 compute-2 edpm-start-podman-container[132315]: Creating additional drop-in dependency for "ovn_controller" (0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9)
Dec 01 09:59:43 compute-2 podman[132339]: 2025-12-01 09:59:43.104740207 +0000 UTC m=+0.073600442 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 01 09:59:43 compute-2 systemd[1]: 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9-1bdb85a2c4a870b7.service: Main process exited, code=exited, status=1/FAILURE
Dec 01 09:59:43 compute-2 systemd[1]: 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9-1bdb85a2c4a870b7.service: Failed with result 'exit-code'.
Dec 01 09:59:43 compute-2 systemd[1]: Reloading.
Dec 01 09:59:43 compute-2 systemd[132361]: Queued start job for default target Main User Target.
Dec 01 09:59:43 compute-2 systemd-rc-local-generator[132418]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 09:59:43 compute-2 systemd[132361]: Created slice User Application Slice.
Dec 01 09:59:43 compute-2 systemd[132361]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec 01 09:59:43 compute-2 systemd[132361]: Started Daily Cleanup of User's Temporary Directories.
Dec 01 09:59:43 compute-2 systemd[132361]: Reached target Paths.
Dec 01 09:59:43 compute-2 systemd[132361]: Reached target Timers.
Dec 01 09:59:43 compute-2 systemd-sysv-generator[132421]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 09:59:43 compute-2 systemd[132361]: Starting D-Bus User Message Bus Socket...
Dec 01 09:59:43 compute-2 systemd[132361]: Starting Create User's Volatile Files and Directories...
Dec 01 09:59:43 compute-2 systemd[132361]: Finished Create User's Volatile Files and Directories.
Dec 01 09:59:43 compute-2 systemd[132361]: Listening on D-Bus User Message Bus Socket.
Dec 01 09:59:43 compute-2 systemd[132361]: Reached target Sockets.
Dec 01 09:59:43 compute-2 systemd[132361]: Reached target Basic System.
Dec 01 09:59:43 compute-2 systemd[132361]: Reached target Main User Target.
Dec 01 09:59:43 compute-2 systemd[132361]: Startup finished in 151ms.
Dec 01 09:59:43 compute-2 systemd[1]: Started User Manager for UID 0.
Dec 01 09:59:43 compute-2 systemd[1]: Started ovn_controller container.
Dec 01 09:59:43 compute-2 sudo[132272]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:43 compute-2 systemd[1]: Started Session c1 of User root.
Dec 01 09:59:43 compute-2 ovn_controller[132332]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 01 09:59:43 compute-2 ovn_controller[132332]: INFO:__main__:Validating config file
Dec 01 09:59:43 compute-2 ovn_controller[132332]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 01 09:59:43 compute-2 ovn_controller[132332]: INFO:__main__:Writing out command to execute
Dec 01 09:59:43 compute-2 systemd[1]: session-c1.scope: Deactivated successfully.
Dec 01 09:59:43 compute-2 ovn_controller[132332]: ++ cat /run_command
Dec 01 09:59:43 compute-2 ovn_controller[132332]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec 01 09:59:43 compute-2 ovn_controller[132332]: + ARGS=
Dec 01 09:59:43 compute-2 ovn_controller[132332]: + sudo kolla_copy_cacerts
Dec 01 09:59:43 compute-2 systemd[1]: Started Session c2 of User root.
Dec 01 09:59:43 compute-2 ovn_controller[132332]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec 01 09:59:43 compute-2 ovn_controller[132332]: + [[ ! -n '' ]]
Dec 01 09:59:43 compute-2 ovn_controller[132332]: + . kolla_extend_start
Dec 01 09:59:43 compute-2 ovn_controller[132332]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Dec 01 09:59:43 compute-2 ovn_controller[132332]: + umask 0022
Dec 01 09:59:43 compute-2 ovn_controller[132332]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Dec 01 09:59:43 compute-2 systemd[1]: session-c2.scope: Deactivated successfully.
Dec 01 09:59:43 compute-2 ovn_controller[132332]: 2025-12-01T09:59:43Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 01 09:59:43 compute-2 ovn_controller[132332]: 2025-12-01T09:59:43Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 01 09:59:43 compute-2 ovn_controller[132332]: 2025-12-01T09:59:43Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Dec 01 09:59:43 compute-2 ovn_controller[132332]: 2025-12-01T09:59:43Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Dec 01 09:59:43 compute-2 ovn_controller[132332]: 2025-12-01T09:59:43Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 01 09:59:43 compute-2 ovn_controller[132332]: 2025-12-01T09:59:43Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Dec 01 09:59:43 compute-2 NetworkManager[49132]: <info>  [1764583183.5958] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Dec 01 09:59:43 compute-2 NetworkManager[49132]: <info>  [1764583183.5964] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 01 09:59:43 compute-2 NetworkManager[49132]: <info>  [1764583183.5975] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Dec 01 09:59:43 compute-2 NetworkManager[49132]: <info>  [1764583183.5980] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Dec 01 09:59:43 compute-2 NetworkManager[49132]: <info>  [1764583183.5984] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec 01 09:59:43 compute-2 kernel: br-int: entered promiscuous mode
Dec 01 09:59:43 compute-2 ovn_controller[132332]: 2025-12-01T09:59:43Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 01 09:59:43 compute-2 ovn_controller[132332]: 2025-12-01T09:59:43Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 01 09:59:43 compute-2 ovn_controller[132332]: 2025-12-01T09:59:43Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 01 09:59:43 compute-2 ovn_controller[132332]: 2025-12-01T09:59:43Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Dec 01 09:59:43 compute-2 ovn_controller[132332]: 2025-12-01T09:59:43Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Dec 01 09:59:43 compute-2 ovn_controller[132332]: 2025-12-01T09:59:43Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Dec 01 09:59:43 compute-2 ovn_controller[132332]: 2025-12-01T09:59:43Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 01 09:59:43 compute-2 ovn_controller[132332]: 2025-12-01T09:59:43Z|00014|main|INFO|OVS feature set changed, force recompute.
Dec 01 09:59:43 compute-2 ovn_controller[132332]: 2025-12-01T09:59:43Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 01 09:59:43 compute-2 ovn_controller[132332]: 2025-12-01T09:59:43Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 01 09:59:43 compute-2 ovn_controller[132332]: 2025-12-01T09:59:43Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 01 09:59:43 compute-2 ovn_controller[132332]: 2025-12-01T09:59:43Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Dec 01 09:59:43 compute-2 ovn_controller[132332]: 2025-12-01T09:59:43Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Dec 01 09:59:43 compute-2 ovn_controller[132332]: 2025-12-01T09:59:43Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 01 09:59:43 compute-2 ovn_controller[132332]: 2025-12-01T09:59:43Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 01 09:59:43 compute-2 ovn_controller[132332]: 2025-12-01T09:59:43Z|00022|main|INFO|OVS feature set changed, force recompute.
Dec 01 09:59:43 compute-2 ovn_controller[132332]: 2025-12-01T09:59:43Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Dec 01 09:59:43 compute-2 ovn_controller[132332]: 2025-12-01T09:59:43Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Dec 01 09:59:43 compute-2 systemd-udevd[132464]: Network interface NamePolicy= disabled on kernel command line.
Dec 01 09:59:43 compute-2 ovn_controller[132332]: 2025-12-01T09:59:43Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 01 09:59:43 compute-2 ovn_controller[132332]: 2025-12-01T09:59:43Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 01 09:59:43 compute-2 ovn_controller[132332]: 2025-12-01T09:59:43Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 01 09:59:43 compute-2 ovn_controller[132332]: 2025-12-01T09:59:43Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 01 09:59:43 compute-2 ovn_controller[132332]: 2025-12-01T09:59:43Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 01 09:59:43 compute-2 ovn_controller[132332]: 2025-12-01T09:59:43Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 01 09:59:43 compute-2 NetworkManager[49132]: <info>  [1764583183.6417] manager: (ovn-9a0c85-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Dec 01 09:59:43 compute-2 systemd-udevd[132467]: Network interface NamePolicy= disabled on kernel command line.
Dec 01 09:59:43 compute-2 NetworkManager[49132]: <info>  [1764583183.7404] device (genev_sys_6081): carrier: link connected
Dec 01 09:59:43 compute-2 NetworkManager[49132]: <info>  [1764583183.7406] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/20)
Dec 01 09:59:43 compute-2 kernel: genev_sys_6081: entered promiscuous mode
Dec 01 09:59:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:43 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:43 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:44 compute-2 NetworkManager[49132]: <info>  [1764583184.1899] manager: (ovn-4d9738-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Dec 01 09:59:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:44.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:44 compute-2 systemd-coredump[132314]: Process 129772 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 54:
                                                    #0  0x00007f0f3584332e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Dec 01 09:59:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:44 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:44 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:45.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:45 compute-2 systemd[1]: systemd-coredump@4-132311-0.service: Deactivated successfully.
Dec 01 09:59:45 compute-2 systemd[1]: systemd-coredump@4-132311-0.service: Consumed 1.546s CPU time.
Dec 01 09:59:45 compute-2 podman[132474]: 2025-12-01 09:59:45.088037415 +0000 UTC m=+0.027259432 container died b4ec70173ca2212101f9b32e84d852b28cb0e7bdfd1bb1a15831731bab135103 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 09:59:45 compute-2 NetworkManager[49132]: <info>  [1764583185.2049] manager: (ovn-b99910-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Dec 01 09:59:45 compute-2 ceph-mon[76053]: pgmap v255: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Dec 01 09:59:45 compute-2 systemd[1]: var-lib-containers-storage-overlay-b708f6edbbf0ee76072583c1b6cae17353eaf5400c560b0722d160293568df24-merged.mount: Deactivated successfully.
Dec 01 09:59:45 compute-2 podman[132474]: 2025-12-01 09:59:45.45880442 +0000 UTC m=+0.398026447 container remove b4ec70173ca2212101f9b32e84d852b28cb0e7bdfd1bb1a15831731bab135103 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:59:45 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec 01 09:59:45 compute-2 sudo[132630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcxtzodybndhxpushbkeexwnycdkclzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583185.2821424-1790-71285850714085/AnsiballZ_command.py'
Dec 01 09:59:45 compute-2 sudo[132630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:45 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec 01 09:59:45 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.789s CPU time.
Dec 01 09:59:45 compute-2 python3.9[132635]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:59:45 compute-2 ovs-vsctl[132649]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Dec 01 09:59:45 compute-2 sudo[132630]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:45 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:45 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:46 compute-2 ceph-mon[76053]: pgmap v256: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Dec 01 09:59:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:59:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:46.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:59:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:46 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:46 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:47.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:47 compute-2 sudo[132800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqrzfrnknfnnrqpiqkbeyiluwqbvsjsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583186.7735307-1814-261470996707129/AnsiballZ_command.py'
Dec 01 09:59:47 compute-2 sudo[132800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:47 compute-2 python3.9[132802]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:59:47 compute-2 ovs-vsctl[132804]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Dec 01 09:59:47 compute-2 sudo[132800]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:47 compute-2 ceph-mon[76053]: pgmap v257: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Dec 01 09:59:47 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:59:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:47 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:47 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:48 compute-2 sudo[132956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrjobyqfhrbrvgqlwtsqrfitefmodgoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583187.874898-1855-61385881896105/AnsiballZ_command.py'
Dec 01 09:59:48 compute-2 sudo[132956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:48 compute-2 python3.9[132958]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 09:59:48 compute-2 ovs-vsctl[132959]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Dec 01 09:59:48 compute-2 sudo[132956]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:59:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:48.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:59:48 compute-2 sshd-session[121203]: Connection closed by 192.168.122.30 port 41948
Dec 01 09:59:48 compute-2 sshd-session[121199]: pam_unix(sshd:session): session closed for user zuul
Dec 01 09:59:48 compute-2 systemd[1]: session-49.scope: Deactivated successfully.
Dec 01 09:59:48 compute-2 systemd[1]: session-49.scope: Consumed 56.170s CPU time.
Dec 01 09:59:48 compute-2 systemd-logind[795]: Session 49 logged out. Waiting for processes to exit.
Dec 01 09:59:48 compute-2 systemd-logind[795]: Removed session 49.
Dec 01 09:59:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:48 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:48 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:59:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:49.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:59:49 compute-2 ceph-mon[76053]: pgmap v258: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 09:59:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:49 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:49 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095950 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 09:59:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:59:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:50.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:59:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095950 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 09:59:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:50 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:50 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 09:59:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:51.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 09:59:51 compute-2 ceph-mon[76053]: pgmap v259: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 09:59:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:51 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:51 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:52.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:52 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:59:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:52 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:52 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:53.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:53 compute-2 systemd[1]: Stopping User Manager for UID 0...
Dec 01 09:59:53 compute-2 systemd[132361]: Activating special unit Exit the Session...
Dec 01 09:59:53 compute-2 systemd[132361]: Stopped target Main User Target.
Dec 01 09:59:53 compute-2 systemd[132361]: Stopped target Basic System.
Dec 01 09:59:53 compute-2 systemd[132361]: Stopped target Paths.
Dec 01 09:59:53 compute-2 systemd[132361]: Stopped target Sockets.
Dec 01 09:59:53 compute-2 systemd[132361]: Stopped target Timers.
Dec 01 09:59:53 compute-2 systemd[132361]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 01 09:59:53 compute-2 systemd[132361]: Closed D-Bus User Message Bus Socket.
Dec 01 09:59:53 compute-2 systemd[132361]: Stopped Create User's Volatile Files and Directories.
Dec 01 09:59:53 compute-2 systemd[132361]: Removed slice User Application Slice.
Dec 01 09:59:53 compute-2 systemd[132361]: Reached target Shutdown.
Dec 01 09:59:53 compute-2 systemd[132361]: Finished Exit the Session.
Dec 01 09:59:53 compute-2 systemd[132361]: Reached target Exit the Session.
Dec 01 09:59:53 compute-2 systemd[1]: user@0.service: Deactivated successfully.
Dec 01 09:59:53 compute-2 systemd[1]: Stopped User Manager for UID 0.
Dec 01 09:59:53 compute-2 systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec 01 09:59:53 compute-2 systemd[1]: run-user-0.mount: Deactivated successfully.
Dec 01 09:59:53 compute-2 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec 01 09:59:53 compute-2 systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec 01 09:59:53 compute-2 systemd[1]: Removed slice User Slice of UID 0.
Dec 01 09:59:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:53 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:53 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:53 compute-2 ceph-mon[76053]: pgmap v260: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Dec 01 09:59:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:54.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:54 compute-2 sshd-session[132993]: Accepted publickey for zuul from 192.168.122.30 port 32900 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 09:59:54 compute-2 systemd-logind[795]: New session 51 of user zuul.
Dec 01 09:59:54 compute-2 systemd[1]: Started Session 51 of User zuul.
Dec 01 09:59:54 compute-2 sshd-session[132993]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 09:59:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:54 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:54 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 09:59:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:55.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:55 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 5.
Dec 01 09:59:55 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 09:59:55 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.789s CPU time.
Dec 01 09:59:55 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec 01 09:59:55 compute-2 python3.9[133148]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 09:59:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:55 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:55 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:56 compute-2 podman[133200]: 2025-12-01 09:59:56.075702343 +0000 UTC m=+0.061846407 container create b6ea6f2e7c18d880fd412a29524505565727a9db6e707b68c61d5dd4e21016dc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 01 09:59:56 compute-2 ceph-mon[76053]: pgmap v261: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Dec 01 09:59:56 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad74adf5a7fd4937cafc70a6c2fcfe2a0123a7535a1ec45cf83c04275bc0d452/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 01 09:59:56 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad74adf5a7fd4937cafc70a6c2fcfe2a0123a7535a1ec45cf83c04275bc0d452/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:59:56 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad74adf5a7fd4937cafc70a6c2fcfe2a0123a7535a1ec45cf83c04275bc0d452/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 09:59:56 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad74adf5a7fd4937cafc70a6c2fcfe2a0123a7535a1ec45cf83c04275bc0d452/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 09:59:56 compute-2 podman[133200]: 2025-12-01 09:59:56.035309143 +0000 UTC m=+0.021453227 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 09:59:56 compute-2 podman[133200]: 2025-12-01 09:59:56.136424532 +0000 UTC m=+0.122568616 container init b6ea6f2e7c18d880fd412a29524505565727a9db6e707b68c61d5dd4e21016dc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 09:59:56 compute-2 podman[133200]: 2025-12-01 09:59:56.141320805 +0000 UTC m=+0.127464869 container start b6ea6f2e7c18d880fd412a29524505565727a9db6e707b68c61d5dd4e21016dc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid)
Dec 01 09:59:56 compute-2 bash[133200]: b6ea6f2e7c18d880fd412a29524505565727a9db6e707b68c61d5dd4e21016dc
Dec 01 09:59:56 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 09:59:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 09:59:56 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 01 09:59:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 09:59:56 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 01 09:59:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 09:59:56 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 01 09:59:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 09:59:56 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 01 09:59:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 09:59:56 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 01 09:59:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 09:59:56 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 01 09:59:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 09:59:56 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 01 09:59:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 09:59:56 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 09:59:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:56.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:56 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:56 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:59:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:57.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:59:57 compute-2 ceph-mon[76053]: pgmap v262: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Dec 01 09:59:57 compute-2 sudo[133408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aegyxejwzhlksztervunzqnkwpwrtoqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583196.698848-64-71274606758156/AnsiballZ_file.py'
Dec 01 09:59:57 compute-2 sudo[133408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:57 compute-2 python3.9[133410]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:59:57 compute-2 sudo[133408]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:57 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 09:59:57 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 09:59:57 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Cumulative writes: 5432 writes, 24K keys, 5432 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 5432 writes, 800 syncs, 6.79 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 5432 writes, 24K keys, 5432 commit groups, 1.0 writes per commit group, ingest: 18.76 MB, 0.03 MB/s
                                           Interval WAL: 5432 writes, 800 syncs, 6.79 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.020       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.020       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.02              0.00         1    0.020       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5636543369b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 4e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5636543369b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 4e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5636543369b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 4e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.006       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 01 09:59:57 compute-2 sudo[133561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wheguwzbwtigqswqjaxbiiekgdcpldxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583197.5828538-64-144197293638859/AnsiballZ_file.py'
Dec 01 09:59:57 compute-2 sudo[133561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:57 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:57 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:58 compute-2 python3.9[133563]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:59:58 compute-2 sudo[133561]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:58 compute-2 sudo[133564]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 09:59:58 compute-2 sudo[133564]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 09:59:58 compute-2 sudo[133564]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:58 compute-2 sudo[133738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlhehjucqlmzoexerrcjjjkegxtutmri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583198.229311-64-63006248910819/AnsiballZ_file.py'
Dec 01 09:59:58 compute-2 sudo[133738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 09:59:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:58.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 09:59:58 compute-2 python3.9[133740]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:59:58 compute-2 sudo[133738]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:58 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:58 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 09:59:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 09:59:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:59.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 09:59:59 compute-2 sudo[133891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zeznmsonoperqqhkdhtflushcwanzana ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583198.8298018-64-148911874229124/AnsiballZ_file.py'
Dec 01 09:59:59 compute-2 sudo[133891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:59 compute-2 python3.9[133893]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:59:59 compute-2 sudo[133891]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:59 compute-2 sudo[134044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etkzqigmvxzbropqjatoyftirkvyfjwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583199.4817083-64-169861286973982/AnsiballZ_file.py'
Dec 01 09:59:59 compute-2 sudo[134044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 09:59:59 compute-2 ceph-mon[76053]: pgmap v263: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Dec 01 09:59:59 compute-2 python3.9[134046]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 09:59:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:59 2025: (VI_0) received an invalid passwd!
Dec 01 09:59:59 compute-2 sudo[134044]: pam_unix(sudo:session): session closed for user root
Dec 01 09:59:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:00:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:00.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:00:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:01 compute-2 ceph-mon[76053]: Health detail: HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore; 1 failed cephadm daemon(s)
Dec 01 10:00:01 compute-2 ceph-mon[76053]: [WRN] BLUESTORE_SLOW_OP_ALERT: 1 OSD(s) experiencing slow operations in BlueStore
Dec 01 10:00:01 compute-2 ceph-mon[76053]:      osd.2 observed slow operation indications in BlueStore
Dec 01 10:00:01 compute-2 ceph-mon[76053]: [WRN] CEPHADM_FAILED_DAEMON: 1 failed cephadm daemon(s)
Dec 01 10:00:01 compute-2 ceph-mon[76053]:     daemon nfs.cephfs.2.0.compute-0.pytvsu on compute-0 is in unknown state
Dec 01 10:00:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:00:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:01.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:00:01 compute-2 python3.9[134197]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 10:00:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:01 compute-2 sudo[134349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlbmgphqxozznaznhmolwltqszyrvlay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583201.5604181-196-156255027507873/AnsiballZ_seboolean.py'
Dec 01 10:00:01 compute-2 sudo[134349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:02 compute-2 ceph-mon[76053]: pgmap v264: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 597 B/s wr, 2 op/s
Dec 01 10:00:02 compute-2 python3.9[134351]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 01 10:00:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:02 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:00:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:02 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:00:02 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:00:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:00:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:02.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:00:02 compute-2 sudo[134349]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000016s ======
Dec 01 10:00:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:03.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000016s
Dec 01 10:00:03 compute-2 ceph-mon[76053]: pgmap v265: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Dec 01 10:00:03 compute-2 python3.9[134503]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:00:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:04 compute-2 python3.9[134624]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583203.2552433-220-48913684827942/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:00:04 compute-2 sudo[134625]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:00:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:04 compute-2 sudo[134625]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:00:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000016s ======
Dec 01 10:00:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:04.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000016s
Dec 01 10:00:04 compute-2 sudo[134625]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:04 compute-2 sudo[134658]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:00:04 compute-2 sudo[134658]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:00:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000017s ======
Dec 01 10:00:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:05.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000017s
Dec 01 10:00:05 compute-2 sudo[134658]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:05 compute-2 python3.9[134842]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:00:05 compute-2 python3.9[134976]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583204.8078763-265-144014725101655/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:00:05 compute-2 ceph-mon[76053]: pgmap v266: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Dec 01 10:00:05 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:00:05 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:00:05 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:00:05 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:00:05 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:00:05 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:00:05 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:00:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:06 compute-2 sudo[135126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vahqfjkhtebgeiiapsnkcaepteaxswgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583206.2552557-316-254972414860538/AnsiballZ_setup.py'
Dec 01 10:00:06 compute-2 sudo[135126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:00:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:06.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:00:06 compute-2 python3.9[135128]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 10:00:06 compute-2 ceph-mon[76053]: pgmap v267: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 680 B/s wr, 2 op/s
Dec 01 10:00:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000016s ======
Dec 01 10:00:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:07.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000016s
Dec 01 10:00:07 compute-2 sudo[135126]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:00:07 compute-2 sudo[135212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkgrxcjwydpjnspzevjektguaahakkwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583206.2552557-316-254972414860538/AnsiballZ_dnf.py'
Dec 01 10:00:07 compute-2 sudo[135212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:07 compute-2 python3.9[135214]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 10:00:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 10:00:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec 01 10:00:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec 01 10:00:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec 01 10:00:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec 01 10:00:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec 01 10:00:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec 01 10:00:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:00:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:00:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:00:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec 01 10:00:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:00:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec 01 10:00:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec 01 10:00:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec 01 10:00:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec 01 10:00:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec 01 10:00:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec 01 10:00:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec 01 10:00:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec 01 10:00:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec 01 10:00:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec 01 10:00:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec 01 10:00:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec 01 10:00:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 10:00:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec 01 10:00:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 10:00:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:00:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:08.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:00:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c2c000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:09 compute-2 ceph-mon[76053]: pgmap v268: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.8 KiB/s rd, 1.1 KiB/s wr, 4 op/s
Dec 01 10:00:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000016s ======
Dec 01 10:00:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:09.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000016s
Dec 01 10:00:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:09 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c20001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:10 compute-2 ceph-mon[76053]: pgmap v269: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.8 KiB/s rd, 1.1 KiB/s wr, 4 op/s
Dec 01 10:00:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:00:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:10 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c04000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100010 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:00:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:00:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:10.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:00:10 compute-2 sudo[135212]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100010 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:00:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:10 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c00000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:10 compute-2 sudo[135289]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:00:10 compute-2 sudo[135289]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:00:10 compute-2 sudo[135289]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000017s ======
Dec 01 10:00:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:11.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000017s
Dec 01 10:00:11 compute-2 sudo[135409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyronutlkrnqrcerihdiufwzkoesjhgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583210.7993884-352-32047505877017/AnsiballZ_systemd.py'
Dec 01 10:00:11 compute-2 sudo[135409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:11 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c0c000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:11 compute-2 python3.9[135411]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 01 10:00:11 compute-2 sudo[135409]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:11 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:00:11 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:00:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:12 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c20001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:12 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:00:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:00:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:12.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:00:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:12 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c00000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:12 compute-2 python3.9[135564]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:00:12 compute-2 ceph-mon[76053]: pgmap v270: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 582 B/s wr, 2 op/s
Dec 01 10:00:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000016s ======
Dec 01 10:00:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:13.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000016s
Dec 01 10:00:13 compute-2 ovn_controller[132332]: 2025-12-01T10:00:13Z|00025|memory|INFO|16256 kB peak resident set size after 29.6 seconds
Dec 01 10:00:13 compute-2 ovn_controller[132332]: 2025-12-01T10:00:13Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:642 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Dec 01 10:00:13 compute-2 podman[135660]: 2025-12-01 10:00:13.284470073 +0000 UTC m=+0.129476620 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 01 10:00:13 compute-2 python3.9[135699]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583212.3548265-376-28422687109561/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:00:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:13 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c00000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:13 compute-2 python3.9[135863]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:00:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:14 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c0c001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:14 compute-2 python3.9[135984]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583213.5457053-376-110997093366049/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:00:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000017s ======
Dec 01 10:00:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:14.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000017s
Dec 01 10:00:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:14 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c20001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:14 compute-2 ceph-mon[76053]: pgmap v271: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 582 B/s wr, 2 op/s
Dec 01 10:00:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:00:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:15.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:00:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:15 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c00000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:16 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c00000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:16 compute-2 python3.9[136136]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:00:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000017s ======
Dec 01 10:00:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:16.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000017s
Dec 01 10:00:16 compute-2 python3.9[136257]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583215.6839511-508-58925896698135/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:00:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:16 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c0c001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:16 compute-2 ceph-mon[76053]: pgmap v272: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 582 B/s wr, 2 op/s
Dec 01 10:00:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000017s ======
Dec 01 10:00:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:17.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000017s
Dec 01 10:00:17 compute-2 python3.9[136408]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:00:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:17 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c20001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:17 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:00:17 compute-2 python3.9[136530]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583216.7908309-508-159558967663340/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:00:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:18 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c00000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:18 compute-2 sudo[136555]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:00:18 compute-2 sudo[136555]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:00:18 compute-2 sudo[136555]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000016s ======
Dec 01 10:00:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:18.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000016s
Dec 01 10:00:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:18 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c00000b60 fd 39 proxy ignored for local
Dec 01 10:00:18 compute-2 kernel: ganesha.nfsd[135230]: segfault at 50 ip 00007f0cd693232e sp 00007f0c8affc210 error 4 in libntirpc.so.5.8[7f0cd6917000+2c000] likely on CPU 0 (core 0, socket 0)
Dec 01 10:00:18 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec 01 10:00:18 compute-2 systemd[1]: Started Process Core Dump (PID 136695/UID 0).
Dec 01 10:00:18 compute-2 python3.9[136707]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 10:00:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000016s ======
Dec 01 10:00:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:19.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000016s
Dec 01 10:00:19 compute-2 ceph-mon[76053]: pgmap v273: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 511 B/s wr, 2 op/s
Dec 01 10:00:19 compute-2 sudo[136861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jecbjqcgajorwougviuljuzyoqnacetk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583219.202683-622-34944663261003/AnsiballZ_file.py'
Dec 01 10:00:19 compute-2 sudo[136861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:19 compute-2 python3.9[136863]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:00:19 compute-2 sudo[136861]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:20 compute-2 sudo[137013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evceznfjbjvjelwflhceuisbjqmupwit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583219.9409852-646-260670387814863/AnsiballZ_stat.py'
Dec 01 10:00:20 compute-2 sudo[137013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:20 compute-2 ceph-mon[76053]: pgmap v274: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Dec 01 10:00:20 compute-2 systemd-coredump[136706]: Process 133228 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 54:
                                                    #0  0x00007f0cd693232e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Dec 01 10:00:20 compute-2 python3.9[137015]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:00:20 compute-2 sudo[137013]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:20 compute-2 systemd[1]: systemd-coredump@5-136695-0.service: Deactivated successfully.
Dec 01 10:00:20 compute-2 systemd[1]: systemd-coredump@5-136695-0.service: Consumed 1.636s CPU time.
Dec 01 10:00:20 compute-2 podman[137022]: 2025-12-01 10:00:20.494340625 +0000 UTC m=+0.029603056 container died b6ea6f2e7c18d880fd412a29524505565727a9db6e707b68c61d5dd4e21016dc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec 01 10:00:20 compute-2 systemd[1]: var-lib-containers-storage-overlay-ad74adf5a7fd4937cafc70a6c2fcfe2a0123a7535a1ec45cf83c04275bc0d452-merged.mount: Deactivated successfully.
Dec 01 10:00:20 compute-2 podman[137022]: 2025-12-01 10:00:20.540798527 +0000 UTC m=+0.076060918 container remove b6ea6f2e7c18d880fd412a29524505565727a9db6e707b68c61d5dd4e21016dc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec 01 10:00:20 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec 01 10:00:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000016s ======
Dec 01 10:00:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:20.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000016s
Dec 01 10:00:20 compute-2 sudo[137129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfnbzynzgnxshqyebwhyrfosfoxnypfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583219.9409852-646-260670387814863/AnsiballZ_file.py'
Dec 01 10:00:20 compute-2 sudo[137129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:20 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec 01 10:00:20 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.635s CPU time.
Dec 01 10:00:20 compute-2 python3.9[137137]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:00:20 compute-2 sudo[137129]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000016s ======
Dec 01 10:00:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:21.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000016s
Dec 01 10:00:21 compute-2 sudo[137290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spjzcxzyfyhmhyhbbmjcytnhvcmuyrlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583221.0256653-646-118199704646485/AnsiballZ_stat.py'
Dec 01 10:00:21 compute-2 sudo[137290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:21 compute-2 python3.9[137292]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:00:21 compute-2 sudo[137290]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:21 compute-2 sudo[137368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smwlcmkxgkcvndtovghjhbvfmzaqmcwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583221.0256653-646-118199704646485/AnsiballZ_file.py'
Dec 01 10:00:21 compute-2 sudo[137368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:21 compute-2 python3.9[137370]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:00:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:21 compute-2 sudo[137368]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:22 compute-2 ceph-mon[76053]: pgmap v275: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 B/s wr, 0 op/s
Dec 01 10:00:22 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:00:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:00:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:22.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:00:22 compute-2 sudo[137521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frfibxhazsieckegdyhpvhaxsztbuewb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583222.5602214-715-39666091975182/AnsiballZ_file.py'
Dec 01 10:00:22 compute-2 sudo[137521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000016s ======
Dec 01 10:00:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:23.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000016s
Dec 01 10:00:23 compute-2 python3.9[137523]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:00:23 compute-2 sudo[137521]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:23 compute-2 sudo[137674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cegmmwfkqppcpeuayffnmfiskyhbylui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583223.4428-740-173155006480092/AnsiballZ_stat.py'
Dec 01 10:00:23 compute-2 sudo[137674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:23 compute-2 python3.9[137676]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:00:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:23 compute-2 sudo[137674]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:24 compute-2 sudo[137752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iykixhyyjvyfeelnokmbeubrczzphaoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583223.4428-740-173155006480092/AnsiballZ_file.py'
Dec 01 10:00:24 compute-2 sudo[137752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:24 compute-2 python3.9[137754]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:00:24 compute-2 sudo[137752]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:24 compute-2 ceph-mon[76053]: pgmap v276: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:00:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:00:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:24.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:00:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100024 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:00:24 compute-2 sudo[137905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snoizyzyvmgffrczwmyogndcwuidaadg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583224.6620831-775-263279625673615/AnsiballZ_stat.py'
Dec 01 10:00:24 compute-2 sudo[137905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:00:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:25.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:00:25 compute-2 python3.9[137907]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:00:25 compute-2 sudo[137905]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:25 compute-2 sudo[137984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cucdqvjjwkxhqahbtpkufnuawazddwbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583224.6620831-775-263279625673615/AnsiballZ_file.py'
Dec 01 10:00:25 compute-2 sudo[137984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:25 compute-2 python3.9[137986]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:00:25 compute-2 sudo[137984]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:00:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:00:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:26.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:00:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:00:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:27.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:00:27 compute-2 sudo[138137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixrecimoesdyibfniupqeyothnowbnsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583226.8364115-811-270072430839161/AnsiballZ_systemd.py'
Dec 01 10:00:27 compute-2 sudo[138137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:27 compute-2 ceph-mon[76053]: pgmap v277: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:00:27 compute-2 python3.9[138139]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 10:00:27 compute-2 systemd[1]: Reloading.
Dec 01 10:00:27 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:00:27 compute-2 systemd-rc-local-generator[138166]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:00:27 compute-2 systemd-sysv-generator[138169]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:00:27 compute-2 sudo[138137]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:28 compute-2 ceph-mon[76053]: pgmap v278: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 10:00:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:00:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:28.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:00:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:00:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:29.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:00:29 compute-2 sudo[138328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-foywuubdyzifvurpexezhdshbidcdlpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583229.1482303-835-14512206048293/AnsiballZ_stat.py'
Dec 01 10:00:29 compute-2 sudo[138328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:29 compute-2 python3.9[138330]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:00:29 compute-2 sudo[138328]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:29 compute-2 sudo[138406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcdzfutqmcvrgpzeiqjkqgbbavjhyhym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583229.1482303-835-14512206048293/AnsiballZ_file.py'
Dec 01 10:00:29 compute-2 sudo[138406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:30 compute-2 python3.9[138408]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:00:30 compute-2 sudo[138406]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:30 compute-2 ceph-mon[76053]: pgmap v279: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:00:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:00:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:30.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:00:30 compute-2 sudo[138558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypskxpudzezwxhekalzuzvypmaoicokb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583230.3855789-871-84199450847681/AnsiballZ_stat.py'
Dec 01 10:00:30 compute-2 sudo[138558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:30 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 6.
Dec 01 10:00:30 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 10:00:30 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.635s CPU time.
Dec 01 10:00:30 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec 01 10:00:30 compute-2 python3.9[138560]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:00:30 compute-2 sudo[138558]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:31 compute-2 podman[138633]: 2025-12-01 10:00:31.052687714 +0000 UTC m=+0.040902874 container create 28267b890dc53e2f112cd14c46e1c3454cf6b642b67c283b7e6538d5ae502ca0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 10:00:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:00:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:31.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:00:31 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6dc17f0e212bfcbe6438ee798d25eb060ca64d2f73ab3532306725969d96ccac/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 01 10:00:31 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6dc17f0e212bfcbe6438ee798d25eb060ca64d2f73ab3532306725969d96ccac/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 10:00:31 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6dc17f0e212bfcbe6438ee798d25eb060ca64d2f73ab3532306725969d96ccac/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 10:00:31 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6dc17f0e212bfcbe6438ee798d25eb060ca64d2f73ab3532306725969d96ccac/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 10:00:31 compute-2 sudo[138701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwfjkiozwjcnpqqrnrkocyknyffutlyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583230.3855789-871-84199450847681/AnsiballZ_file.py'
Dec 01 10:00:31 compute-2 podman[138633]: 2025-12-01 10:00:31.12180646 +0000 UTC m=+0.110021640 container init 28267b890dc53e2f112cd14c46e1c3454cf6b642b67c283b7e6538d5ae502ca0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 01 10:00:31 compute-2 sudo[138701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:31 compute-2 podman[138633]: 2025-12-01 10:00:31.129623962 +0000 UTC m=+0.117839122 container start 28267b890dc53e2f112cd14c46e1c3454cf6b642b67c283b7e6538d5ae502ca0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 10:00:31 compute-2 podman[138633]: 2025-12-01 10:00:31.033836852 +0000 UTC m=+0.022052042 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 10:00:31 compute-2 bash[138633]: 28267b890dc53e2f112cd14c46e1c3454cf6b642b67c283b7e6538d5ae502ca0
Dec 01 10:00:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:31 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 01 10:00:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:31 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 01 10:00:31 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 10:00:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:31 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 01 10:00:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:31 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 01 10:00:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:31 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 01 10:00:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:31 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 01 10:00:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:31 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 01 10:00:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:31 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:00:31 compute-2 python3.9[138704]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:00:31 compute-2 sudo[138701]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:31 compute-2 sudo[138893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdgetdkyuomjkdjlqiudmvucvuwdnnay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583231.6827672-908-44248917734901/AnsiballZ_systemd.py'
Dec 01 10:00:31 compute-2 sudo[138893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:32 compute-2 python3.9[138895]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 10:00:32 compute-2 systemd[1]: Reloading.
Dec 01 10:00:32 compute-2 systemd-rc-local-generator[138926]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:00:32 compute-2 systemd-sysv-generator[138929]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:00:32 compute-2 ceph-mon[76053]: pgmap v280: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 10:00:32 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:00:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:00:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:32.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:00:32 compute-2 systemd[1]: Starting Create netns directory...
Dec 01 10:00:32 compute-2 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 01 10:00:32 compute-2 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 01 10:00:32 compute-2 systemd[1]: Finished Create netns directory.
Dec 01 10:00:32 compute-2 sudo[138893]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:00:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:33.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:00:33 compute-2 sudo[139091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyfpwiwxosfilzdkwfjmejnpyzogtumm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583233.1078246-938-154000341003356/AnsiballZ_file.py'
Dec 01 10:00:33 compute-2 sudo[139091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:33 compute-2 python3.9[139093]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:00:33 compute-2 sudo[139091]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:34 compute-2 sudo[139243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rresejbjlyapmzjvrheemewxdcibdxqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583233.8837721-962-152254643033791/AnsiballZ_stat.py'
Dec 01 10:00:34 compute-2 sudo[139243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:34 compute-2 python3.9[139245]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:00:34 compute-2 sudo[139243]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:34 compute-2 ceph-mon[76053]: pgmap v281: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:00:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:00:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:34.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:00:34 compute-2 sudo[139366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tiklskkxysvslwvunllhbwwueafzomzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583233.8837721-962-152254643033791/AnsiballZ_copy.py'
Dec 01 10:00:34 compute-2 sudo[139366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:34 compute-2 python3.9[139368]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583233.8837721-962-152254643033791/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:00:34 compute-2 sudo[139366]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:00:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:35.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:00:35 compute-2 sudo[139520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhgyruzbbtlhiywhlohbydnvptisggqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583235.5236645-1013-42208853072989/AnsiballZ_file.py'
Dec 01 10:00:35 compute-2 sudo[139520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:35 compute-2 python3.9[139522]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:00:36 compute-2 sudo[139520]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:00:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:36.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:00:36 compute-2 ceph-mon[76053]: pgmap v282: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:00:36 compute-2 sudo[139672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqltpgbnblobbeyozfywdrvmkqrpcrqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583236.3685699-1036-266070992127625/AnsiballZ_stat.py'
Dec 01 10:00:36 compute-2 sudo[139672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:36 compute-2 python3.9[139674]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:00:36 compute-2 sudo[139672]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:00:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:37.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:00:37 compute-2 sudo[139796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uahtovduiogvudfuiyvikhcqkjicldqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583236.3685699-1036-266070992127625/AnsiballZ_copy.py'
Dec 01 10:00:37 compute-2 sudo[139796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:37 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:00:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:37 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:00:37 compute-2 python3.9[139798]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583236.3685699-1036-266070992127625/.source.json _original_basename=.yv41g5uy follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:00:37 compute-2 sudo[139796]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:37 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:00:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:37 compute-2 sudo[139949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dedpvdbzvthfmmkesurwvpxjwcahrxgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583237.7146041-1081-237390581365495/AnsiballZ_file.py'
Dec 01 10:00:37 compute-2 sudo[139949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:38 compute-2 python3.9[139951]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:00:38 compute-2 sudo[139949]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:38 compute-2 sudo[139954]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:00:38 compute-2 sudo[139954]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:00:38 compute-2 sudo[139954]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:00:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:38.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:00:38 compute-2 ceph-mon[76053]: pgmap v283: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 596 B/s wr, 1 op/s
Dec 01 10:00:38 compute-2 sudo[140126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fphlivxhlekkwrpjjgozaodicwkvlyzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583238.4667785-1105-173377453074315/AnsiballZ_stat.py'
Dec 01 10:00:38 compute-2 sudo[140126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:38 compute-2 sudo[140126]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:00:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:39.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:00:39 compute-2 sudo[140251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adxyioszhnentawggzgecfgjjiplneur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583238.4667785-1105-173377453074315/AnsiballZ_copy.py'
Dec 01 10:00:39 compute-2 sudo[140251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:39 compute-2 sudo[140251]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:39 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:00:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:40 compute-2 sudo[140403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oigwqvsnrlcyjffceuypzqtxvrkpnvdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583239.961791-1156-264940971825128/AnsiballZ_container_config_data.py'
Dec 01 10:00:40 compute-2 sudo[140403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:40 compute-2 python3.9[140405]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Dec 01 10:00:40 compute-2 sudo[140403]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:00:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:40.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:00:40 compute-2 ceph-mon[76053]: pgmap v284: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 596 B/s wr, 1 op/s
Dec 01 10:00:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:00:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:41.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:00:41 compute-2 sudo[140556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eabfdhxzfmnznsswddyigwydjlfctely ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583240.8657565-1183-114241648695608/AnsiballZ_container_config_hash.py'
Dec 01 10:00:41 compute-2 sudo[140556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:41 compute-2 python3.9[140558]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 01 10:00:41 compute-2 sudo[140556]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:42 compute-2 sudo[140709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-diiiecircinzesqtncdxnbmdhoghcecs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583241.7839863-1210-221034101223524/AnsiballZ_podman_container_info.py'
Dec 01 10:00:42 compute-2 sudo[140709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:42 compute-2 ceph-mon[76053]: pgmap v285: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.6 KiB/s rd, 938 B/s wr, 3 op/s
Dec 01 10:00:42 compute-2 python3.9[140711]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 01 10:00:42 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:00:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:00:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:42.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:00:42 compute-2 sudo[140709]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:00:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:43.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:00:43 compute-2 podman[140766]: 2025-12-01 10:00:43.454808371 +0000 UTC m=+0.099824190 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 01 10:00:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 10:00:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec 01 10:00:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec 01 10:00:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec 01 10:00:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec 01 10:00:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec 01 10:00:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec 01 10:00:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:00:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:00:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:00:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec 01 10:00:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:00:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec 01 10:00:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec 01 10:00:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec 01 10:00:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec 01 10:00:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec 01 10:00:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec 01 10:00:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec 01 10:00:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec 01 10:00:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec 01 10:00:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec 01 10:00:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec 01 10:00:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec 01 10:00:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 10:00:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec 01 10:00:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 10:00:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:44 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7260000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:44 compute-2 sudo[140930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcinvmcbtaisrigykmahpnvtrwaxywvn ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764583243.6398375-1249-166818486564513/AnsiballZ_edpm_container_manage.py'
Dec 01 10:00:44 compute-2 sudo[140930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:44 compute-2 python3[140934]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 01 10:00:44 compute-2 ceph-mon[76053]: pgmap v286: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 938 B/s wr, 3 op/s
Dec 01 10:00:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:00:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:44.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:00:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:44 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f72480016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:00:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:45.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:00:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:45 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f723c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:46 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7238000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:46 compute-2 ceph-mon[76053]: pgmap v287: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 938 B/s wr, 3 op/s
Dec 01 10:00:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:00:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:46.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:00:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100046 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:00:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:46 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f725c001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:00:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:47.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:00:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:47 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f72480016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:47 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:00:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:48 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f723c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:00:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:48.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:00:48 compute-2 ceph-mon[76053]: pgmap v288: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 10:00:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:48 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f72380016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:00:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:49.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:00:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:49 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f725c0023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:50 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f72480023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:00:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:50.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:00:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:50 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f723c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:50 compute-2 ceph-mon[76053]: pgmap v289: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.6 KiB/s rd, 426 B/s wr, 2 op/s
Dec 01 10:00:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:00:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:51.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:00:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:51 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f72380016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:52 compute-2 ceph-mon[76053]: pgmap v290: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.6 KiB/s rd, 426 B/s wr, 2 op/s
Dec 01 10:00:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:52 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f725c0023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:00:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:52.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:00:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:52 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f72480023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:52 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:00:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:00:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:53.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:00:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:53 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f723c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:54 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f72380016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:00:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:54.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:00:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:54 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f725c0023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:00:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:55.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:00:55 compute-2 ceph-mon[76053]: pgmap v291: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 85 B/s wr, 0 op/s
Dec 01 10:00:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:00:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:55 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f72480023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:56 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f723c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:56 compute-2 ceph-mon[76053]: pgmap v292: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 85 B/s wr, 0 op/s
Dec 01 10:00:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:00:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:56.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:00:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:56 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7238002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:00:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:57.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:00:57 compute-2 podman[140947]: 2025-12-01 10:00:57.158235409 +0000 UTC m=+12.629355352 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 01 10:00:57 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Dec 01 10:00:57 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Dec 01 10:00:57 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Dec 01 10:00:57 compute-2 podman[141082]: 2025-12-01 10:00:57.316851231 +0000 UTC m=+0.051427292 container create 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Dec 01 10:00:57 compute-2 podman[141082]: 2025-12-01 10:00:57.288829673 +0000 UTC m=+0.023405734 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 01 10:00:57 compute-2 python3[140934]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 01 10:00:57 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Dec 01 10:00:57 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Dec 01 10:00:57 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Dec 01 10:00:57 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Dec 01 10:00:57 compute-2 sudo[140930]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:57 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f725c0034e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:57 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:00:57 compute-2 sudo[141271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jskcpgbvxhhbrvqwciwhqenkhvcshcpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583257.6612484-1273-269293646685349/AnsiballZ_stat.py'
Dec 01 10:00:57 compute-2 sudo[141271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:58 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f725c0034e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:58 compute-2 python3.9[141273]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 10:00:58 compute-2 sudo[141271]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:58 compute-2 sudo[141300]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:00:58 compute-2 sudo[141300]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:00:58 compute-2 sudo[141300]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:00:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:58.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:00:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:58 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f723c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:58 compute-2 ceph-mon[76053]: pgmap v293: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Dec 01 10:00:58 compute-2 sudo[141450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgemzyuoddapfwkdspnymcsjlgtbtyde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583258.5241282-1300-243616982352761/AnsiballZ_file.py'
Dec 01 10:00:58 compute-2 sudo[141450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:58 compute-2 python3.9[141452]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:00:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:58 compute-2 sudo[141450]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:00:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:00:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:59.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:00:59 compute-2 sudo[141527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bibkdflvpzgdrmyofzigaihqzfbjdclb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583258.5241282-1300-243616982352761/AnsiballZ_stat.py'
Dec 01 10:00:59 compute-2 sudo[141527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:59 compute-2 python3.9[141529]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 10:00:59 compute-2 sudo[141527]: pam_unix(sudo:session): session closed for user root
Dec 01 10:00:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:59 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7238002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:00:59 compute-2 sudo[141679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjznjruendsgwpymbxxxjpadvxbwtunc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583259.4885793-1300-155072760256525/AnsiballZ_copy.py'
Dec 01 10:00:59 compute-2 sudo[141679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:00:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:00:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:00 compute-2 python3.9[141681]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764583259.4885793-1300-155072760256525/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:01:00 compute-2 sudo[141679]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:01:00 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f72480023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:01:00 compute-2 sudo[141755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aspdyntmefqestjvleljgwsuukqzblex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583259.4885793-1300-155072760256525/AnsiballZ_systemd.py'
Dec 01 10:01:00 compute-2 sudo[141755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:00 compute-2 ceph-mon[76053]: pgmap v294: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:01:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:01:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:00.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:01:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:01:00 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f725c0034e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:01:00 compute-2 python3.9[141757]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 01 10:01:00 compute-2 systemd[1]: Reloading.
Dec 01 10:01:00 compute-2 systemd-rc-local-generator[141784]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:01:00 compute-2 systemd-sysv-generator[141787]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:01:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:01 compute-2 sudo[141755]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:01.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:01 compute-2 sudo[141869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbdcccomxkxlrvsiydjpafnpemdkpabf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583259.4885793-1300-155072760256525/AnsiballZ_systemd.py'
Dec 01 10:01:01 compute-2 sudo[141869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:01:01 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f723c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:01:01 compute-2 python3.9[141871]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 10:01:01 compute-2 systemd[1]: Reloading.
Dec 01 10:01:01 compute-2 systemd-rc-local-generator[141901]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:01:01 compute-2 systemd-sysv-generator[141904]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:01:01 compute-2 CROND[141910]: (root) CMD (run-parts /etc/cron.hourly)
Dec 01 10:01:01 compute-2 run-parts[141913]: (/etc/cron.hourly) starting 0anacron
Dec 01 10:01:01 compute-2 anacron[141921]: Anacron started on 2025-12-01
Dec 01 10:01:01 compute-2 anacron[141921]: Will run job `cron.daily' in 20 min.
Dec 01 10:01:01 compute-2 anacron[141921]: Will run job `cron.weekly' in 40 min.
Dec 01 10:01:01 compute-2 anacron[141921]: Will run job `cron.monthly' in 60 min.
Dec 01 10:01:01 compute-2 anacron[141921]: Jobs will be executed sequentially
Dec 01 10:01:01 compute-2 run-parts[141923]: (/etc/cron.hourly) finished 0anacron
Dec 01 10:01:01 compute-2 CROND[141909]: (root) CMDEND (run-parts /etc/cron.hourly)
Dec 01 10:01:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:02 compute-2 systemd[1]: Starting ovn_metadata_agent container...
Dec 01 10:01:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:01:02 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7238002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:01:02 compute-2 systemd[1]: Started libcrun container.
Dec 01 10:01:02 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed2a5a36d0f041c6a7ee2fb88b3c36c0eb9f74257202321965afb3d8371e9272/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec 01 10:01:02 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed2a5a36d0f041c6a7ee2fb88b3c36c0eb9f74257202321965afb3d8371e9272/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 01 10:01:02 compute-2 systemd[1]: Started /usr/bin/podman healthcheck run 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44.
Dec 01 10:01:02 compute-2 podman[141928]: 2025-12-01 10:01:02.403786369 +0000 UTC m=+0.371793634 container init 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 01 10:01:02 compute-2 ovn_metadata_agent[141944]: + sudo -E kolla_set_configs
Dec 01 10:01:02 compute-2 podman[141928]: 2025-12-01 10:01:02.4278328 +0000 UTC m=+0.395840035 container start 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 01 10:01:02 compute-2 edpm-start-podman-container[141928]: ovn_metadata_agent
Dec 01 10:01:02 compute-2 ovn_metadata_agent[141944]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 01 10:01:02 compute-2 ovn_metadata_agent[141944]: INFO:__main__:Validating config file
Dec 01 10:01:02 compute-2 ovn_metadata_agent[141944]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 01 10:01:02 compute-2 ovn_metadata_agent[141944]: INFO:__main__:Copying service configuration files
Dec 01 10:01:02 compute-2 ovn_metadata_agent[141944]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec 01 10:01:02 compute-2 ovn_metadata_agent[141944]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec 01 10:01:02 compute-2 ovn_metadata_agent[141944]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec 01 10:01:02 compute-2 ovn_metadata_agent[141944]: INFO:__main__:Writing out command to execute
Dec 01 10:01:02 compute-2 ovn_metadata_agent[141944]: INFO:__main__:Setting permission for /var/lib/neutron
Dec 01 10:01:02 compute-2 ovn_metadata_agent[141944]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec 01 10:01:02 compute-2 ovn_metadata_agent[141944]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec 01 10:01:02 compute-2 ovn_metadata_agent[141944]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec 01 10:01:02 compute-2 ovn_metadata_agent[141944]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec 01 10:01:02 compute-2 ovn_metadata_agent[141944]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec 01 10:01:02 compute-2 ovn_metadata_agent[141944]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec 01 10:01:02 compute-2 ovn_metadata_agent[141944]: ++ cat /run_command
Dec 01 10:01:02 compute-2 edpm-start-podman-container[141927]: Creating additional drop-in dependency for "ovn_metadata_agent" (8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44)
Dec 01 10:01:02 compute-2 ovn_metadata_agent[141944]: + CMD=neutron-ovn-metadata-agent
Dec 01 10:01:02 compute-2 ovn_metadata_agent[141944]: + ARGS=
Dec 01 10:01:02 compute-2 ovn_metadata_agent[141944]: + sudo kolla_copy_cacerts
Dec 01 10:01:02 compute-2 systemd[1]: Reloading.
Dec 01 10:01:02 compute-2 ovn_metadata_agent[141944]: + [[ ! -n '' ]]
Dec 01 10:01:02 compute-2 ovn_metadata_agent[141944]: + . kolla_extend_start
Dec 01 10:01:02 compute-2 ovn_metadata_agent[141944]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Dec 01 10:01:02 compute-2 ovn_metadata_agent[141944]: Running command: 'neutron-ovn-metadata-agent'
Dec 01 10:01:02 compute-2 ovn_metadata_agent[141944]: + umask 0022
Dec 01 10:01:02 compute-2 ovn_metadata_agent[141944]: + exec neutron-ovn-metadata-agent
Dec 01 10:01:02 compute-2 podman[141951]: 2025-12-01 10:01:02.533989794 +0000 UTC m=+0.090469850 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 10:01:02 compute-2 systemd-rc-local-generator[142023]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:01:02 compute-2 systemd-sysv-generator[142027]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:01:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:02.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:02 compute-2 ceph-mon[76053]: pgmap v295: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 86 KiB/s rd, 0 B/s wr, 142 op/s
Dec 01 10:01:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:01:02 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f72480023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:01:02 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:01:02 compute-2 systemd[1]: Started ovn_metadata_agent container.
Dec 01 10:01:02 compute-2 sudo[141869]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:01:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:03.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:01:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:01:03 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f72480023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:01:03 compute-2 sshd-session[132997]: Connection closed by 192.168.122.30 port 32900
Dec 01 10:01:03 compute-2 sshd-session[132993]: pam_unix(sshd:session): session closed for user zuul
Dec 01 10:01:03 compute-2 systemd[1]: session-51.scope: Deactivated successfully.
Dec 01 10:01:03 compute-2 systemd[1]: session-51.scope: Consumed 1min 499ms CPU time.
Dec 01 10:01:03 compute-2 systemd-logind[795]: Session 51 logged out. Waiting for processes to exit.
Dec 01 10:01:03 compute-2 systemd-logind[795]: Removed session 51.
Dec 01 10:01:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:01:04 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f725c0034e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.638 141949 INFO neutron.common.config [-] Logging enabled!
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.639 141949 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.639 141949 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.639 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.640 141949 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.640 141949 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.640 141949 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.640 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.640 141949 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.640 141949 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.640 141949 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.641 141949 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.641 141949 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.641 141949 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.641 141949 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.641 141949 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.641 141949 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.641 141949 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.642 141949 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.642 141949 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.642 141949 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.642 141949 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.642 141949 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.642 141949 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.642 141949 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.642 141949 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.642 141949 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.643 141949 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.643 141949 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.643 141949 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.643 141949 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.643 141949 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.643 141949 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.643 141949 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.644 141949 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.644 141949 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.644 141949 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.644 141949 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.644 141949 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.644 141949 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.644 141949 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.645 141949 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.645 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.645 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.645 141949 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.645 141949 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.645 141949 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.645 141949 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.645 141949 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.645 141949 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.646 141949 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.646 141949 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.646 141949 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.646 141949 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.646 141949 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.646 141949 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.646 141949 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.646 141949 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.647 141949 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.647 141949 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.647 141949 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.647 141949 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.647 141949 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.647 141949 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.647 141949 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.647 141949 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.648 141949 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.648 141949 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.648 141949 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.648 141949 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.648 141949 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.648 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.648 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.648 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.649 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.649 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.649 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.649 141949 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.649 141949 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.649 141949 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.649 141949 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.649 141949 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.650 141949 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.650 141949 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.650 141949 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.650 141949 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.650 141949 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.650 141949 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.650 141949 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.650 141949 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.651 141949 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.651 141949 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.651 141949 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.651 141949 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.651 141949 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.651 141949 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.651 141949 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.651 141949 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.651 141949 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.652 141949 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.652 141949 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.652 141949 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.652 141949 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.652 141949 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.652 141949 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.652 141949 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.653 141949 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.653 141949 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.653 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.653 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.653 141949 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.653 141949 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.653 141949 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.654 141949 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.654 141949 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.654 141949 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.654 141949 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.654 141949 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.654 141949 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.654 141949 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.654 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.655 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.655 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.655 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.655 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.655 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.655 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.655 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.655 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.655 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.656 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.656 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.656 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.656 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.656 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.656 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.656 141949 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.657 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.657 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.657 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.657 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.657 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.657 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.657 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.657 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.658 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.658 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.658 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.658 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.658 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.658 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.658 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.658 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.659 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.659 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.659 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.659 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.659 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.659 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.659 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.660 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.660 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.660 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.660 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.660 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.660 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.660 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.660 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.661 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.661 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.661 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.661 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.661 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.661 141949 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.661 141949 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.661 141949 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.662 141949 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.662 141949 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.662 141949 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.662 141949 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.662 141949 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.662 141949 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.662 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.662 141949 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.663 141949 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.663 141949 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.663 141949 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.663 141949 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.663 141949 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.663 141949 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.663 141949 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.663 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.664 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.664 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.664 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.664 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.664 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.664 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.664 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.664 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.665 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.665 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.665 141949 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.665 141949 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.665 141949 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.665 141949 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.665 141949 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.665 141949 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.666 141949 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.666 141949 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.666 141949 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.666 141949 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.666 141949 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.666 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.666 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.666 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.667 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.667 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.667 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.667 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.667 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.667 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.667 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.667 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.668 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.668 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.668 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.668 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.668 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.668 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.668 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.668 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.669 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.669 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.669 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.669 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.669 141949 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.669 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.669 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.669 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.670 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.670 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.670 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.670 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.670 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.670 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.670 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.671 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.671 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.671 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.671 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.671 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.671 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.671 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.672 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.672 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.672 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.672 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.672 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:04.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.672 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.672 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.672 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.673 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.673 141949 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.673 141949 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.673 141949 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.673 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.673 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.673 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.673 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.674 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.674 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.674 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.674 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.674 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.674 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.674 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.674 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.675 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.675 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.675 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.675 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.675 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.675 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.675 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.675 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.676 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.676 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.676 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.676 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.676 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.676 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.676 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.676 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.677 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.677 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.677 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.677 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.677 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.677 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.677 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.677 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.678 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.678 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.688 141949 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.688 141949 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.688 141949 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.688 141949 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Dec 01 10:01:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:01:04 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7238003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.689 141949 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Dec 01 10:01:04 compute-2 ceph-mon[76053]: pgmap v296: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 86 KiB/s rd, 0 B/s wr, 142 op/s
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.705 141949 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 968d9d26-f45d-4d49-addd-0befc9c8f4a3 (UUID: 968d9d26-f45d-4d49-addd-0befc9c8f4a3) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.734 141949 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.734 141949 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.734 141949 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.734 141949 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.738 141949 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.745 141949 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.753 141949 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '968d9d26-f45d-4d49-addd-0befc9c8f4a3'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7ffb887dba00>], external_ids={}, name=968d9d26-f45d-4d49-addd-0befc9c8f4a3, nb_cfg_timestamp=1764583191623, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.754 141949 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7ffb887dec10>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.755 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.755 141949 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.756 141949 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.756 141949 INFO oslo_service.service [-] Starting 1 workers
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.761 141949 DEBUG oslo_service.service [-] Started child 142061 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.765 142061 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-890601'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.766 141949 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp6qyqxi4b/privsep.sock']
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.791 142061 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.791 142061 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.791 142061 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.796 142061 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.802 142061 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 01 10:01:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.808 142061 INFO eventlet.wsgi.server [-] (142061) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Dec 01 10:01:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:05.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:05 compute-2 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Dec 01 10:01:05 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:05.479 141949 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 01 10:01:05 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:05.480 141949 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp6qyqxi4b/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Dec 01 10:01:05 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:05.338 142068 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 01 10:01:05 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:05.343 142068 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 01 10:01:05 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:05.345 142068 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Dec 01 10:01:05 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:05.345 142068 INFO oslo.privsep.daemon [-] privsep daemon running as pid 142068
Dec 01 10:01:05 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:05.483 142068 DEBUG oslo.privsep.daemon [-] privsep: reply[2474a921-f865-431f-80d7-2be0f08f54af]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 01 10:01:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:01:05 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f72480023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:01:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.041 142068 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.042 142068 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.042 142068 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:01:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:01:06 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f72480023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.649 142068 DEBUG oslo.privsep.daemon [-] privsep: reply[c04e4296-1167-4fc3-8859-b2b43a502694]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.651 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, column=external_ids, values=({'neutron:ovn-metadata-id': '1277b2fd-192f-596e-a0e9-42ef74e2b28e'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.658 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.665 141949 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.665 141949 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.665 141949 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.665 141949 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.665 141949 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.665 141949 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.665 141949 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.665 141949 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.666 141949 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.666 141949 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.666 141949 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.666 141949 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.666 141949 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.666 141949 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.666 141949 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.666 141949 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.667 141949 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.667 141949 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.667 141949 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.667 141949 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.667 141949 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.667 141949 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.668 141949 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.668 141949 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.668 141949 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.668 141949 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.668 141949 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.668 141949 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.668 141949 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.668 141949 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.669 141949 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.669 141949 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.669 141949 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.669 141949 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.669 141949 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.669 141949 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.669 141949 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.669 141949 DEBUG oslo_service.service [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.670 141949 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.670 141949 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.670 141949 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.670 141949 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.670 141949 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.670 141949 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.670 141949 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.670 141949 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.671 141949 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.671 141949 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.671 141949 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.671 141949 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.671 141949 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.671 141949 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.671 141949 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.671 141949 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.671 141949 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.671 141949 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.672 141949 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.672 141949 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.672 141949 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.672 141949 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.672 141949 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.672 141949 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.672 141949 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.672 141949 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.672 141949 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.673 141949 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.673 141949 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.673 141949 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.673 141949 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.673 141949 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.673 141949 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.673 141949 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.673 141949 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.673 141949 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.674 141949 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.674 141949 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.674 141949 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.674 141949 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.674 141949 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.674 141949 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.674 141949 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.674 141949 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.674 141949 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.675 141949 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.675 141949 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.675 141949 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.675 141949 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.675 141949 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.675 141949 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.675 141949 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.675 141949 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.675 141949 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.676 141949 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.676 141949 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.676 141949 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.676 141949 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:06.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.676 141949 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.676 141949 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.676 141949 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.677 141949 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.677 141949 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.677 141949 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.677 141949 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.677 141949 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.677 141949 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.677 141949 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.677 141949 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.678 141949 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.678 141949 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.678 141949 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.678 141949 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.678 141949 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.678 141949 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.678 141949 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.679 141949 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.679 141949 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.679 141949 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.679 141949 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.679 141949 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.679 141949 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.679 141949 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.680 141949 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.680 141949 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.680 141949 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.680 141949 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.680 141949 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.680 141949 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.680 141949 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.680 141949 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.681 141949 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.681 141949 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.681 141949 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.688 141949 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.688 141949 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.689 141949 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.689 141949 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.689 141949 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.689 141949 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.689 141949 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.690 141949 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:01:06 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f725c0034e0 fd 38 proxy ignored for local
Dec 01 10:01:06 compute-2 kernel: ganesha.nfsd[140807]: segfault at 50 ip 00007f730f25c32e sp 00007f72de7fb210 error 4 in libntirpc.so.5.8[7f730f241000+2c000] likely on CPU 5 (core 0, socket 5)
Dec 01 10:01:06 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.690 141949 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.690 141949 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.690 141949 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.690 141949 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.690 141949 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.690 141949 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.691 141949 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.691 141949 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.691 141949 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.691 141949 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.691 141949 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.692 141949 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.692 141949 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.692 141949 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.692 141949 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.692 141949 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.692 141949 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.692 141949 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.693 141949 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.693 141949 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.693 141949 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.693 141949 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.693 141949 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.693 141949 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.693 141949 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.693 141949 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.693 141949 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.694 141949 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.694 141949 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.694 141949 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.694 141949 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.694 141949 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.694 141949 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.694 141949 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.694 141949 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.694 141949 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.695 141949 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.695 141949 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.695 141949 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.695 141949 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.695 141949 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.695 141949 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.695 141949 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.696 141949 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.696 141949 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.696 141949 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.696 141949 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.696 141949 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.696 141949 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.696 141949 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.696 141949 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.697 141949 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.697 141949 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.697 141949 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.697 141949 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.697 141949 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.697 141949 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.697 141949 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.697 141949 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.697 141949 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.698 141949 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.698 141949 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.698 141949 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.698 141949 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.698 141949 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.698 141949 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.698 141949 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.699 141949 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.699 141949 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.699 141949 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.699 141949 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.699 141949 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.699 141949 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.700 141949 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.700 141949 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.700 141949 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.700 141949 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.700 141949 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.700 141949 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.701 141949 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.701 141949 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.701 141949 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.702 141949 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.702 141949 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.702 141949 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.702 141949 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.702 141949 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.703 141949 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.703 141949 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.703 141949 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.703 141949 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.703 141949 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.703 141949 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.703 141949 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.703 141949 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.704 141949 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.704 141949 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.704 141949 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.704 141949 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.704 141949 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.704 141949 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.704 141949 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.704 141949 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.705 141949 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.705 141949 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.705 141949 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.705 141949 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.705 141949 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.705 141949 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.705 141949 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.705 141949 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.705 141949 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.706 141949 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.706 141949 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.706 141949 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.706 141949 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.706 141949 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.706 141949 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.706 141949 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.706 141949 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.707 141949 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.707 141949 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.707 141949 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.707 141949 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.707 141949 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.707 141949 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.707 141949 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.708 141949 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.708 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.708 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.708 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.708 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.708 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.708 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.709 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.709 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.709 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.709 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.709 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.709 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.709 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.710 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.710 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.710 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.710 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.710 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.710 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.710 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.710 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.710 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.710 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.711 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.711 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.711 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.711 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.711 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.711 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.711 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.711 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.711 141949 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.712 141949 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.712 141949 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.712 141949 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:01:06 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.712 141949 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 01 10:01:06 compute-2 ceph-mon[76053]: pgmap v297: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 86 KiB/s rd, 0 B/s wr, 142 op/s
Dec 01 10:01:06 compute-2 systemd[1]: Started Process Core Dump (PID 142073/UID 0).
Dec 01 10:01:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:07.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:01:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000029s ======
Dec 01 10:01:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:08.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Dec 01 10:01:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:09.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:09 compute-2 sshd-session[142079]: Accepted publickey for zuul from 192.168.122.30 port 51380 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 10:01:09 compute-2 systemd-logind[795]: New session 52 of user zuul.
Dec 01 10:01:09 compute-2 systemd[1]: Started Session 52 of User zuul.
Dec 01 10:01:09 compute-2 sshd-session[142079]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 10:01:09 compute-2 ceph-mon[76053]: pgmap v298: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 86 KiB/s rd, 0 B/s wr, 142 op/s
Dec 01 10:01:09 compute-2 systemd-coredump[142074]: Process 138705 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 42:
                                                    #0  0x00007f730f25c32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Dec 01 10:01:09 compute-2 systemd[1]: systemd-coredump@6-142073-0.service: Deactivated successfully.
Dec 01 10:01:09 compute-2 systemd[1]: systemd-coredump@6-142073-0.service: Consumed 2.680s CPU time.
Dec 01 10:01:09 compute-2 podman[142139]: 2025-12-01 10:01:09.820780463 +0000 UTC m=+0.029350065 container died 28267b890dc53e2f112cd14c46e1c3454cf6b642b67c283b7e6538d5ae502ca0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 10:01:09 compute-2 systemd[1]: var-lib-containers-storage-overlay-6dc17f0e212bfcbe6438ee798d25eb060ca64d2f73ab3532306725969d96ccac-merged.mount: Deactivated successfully.
Dec 01 10:01:09 compute-2 podman[142139]: 2025-12-01 10:01:09.864635324 +0000 UTC m=+0.073204916 container remove 28267b890dc53e2f112cd14c46e1c3454cf6b642b67c283b7e6538d5ae502ca0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2)
Dec 01 10:01:09 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec 01 10:01:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:10 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec 01 10:01:10 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 2.343s CPU time.
Dec 01 10:01:10 compute-2 ceph-mon[76053]: pgmap v299: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 86 KiB/s rd, 0 B/s wr, 142 op/s
Dec 01 10:01:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:01:10 compute-2 python3.9[142278]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 10:01:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000029s ======
Dec 01 10:01:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:10.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Dec 01 10:01:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:11 compute-2 sudo[142307]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:01:11 compute-2 sudo[142307]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:01:11 compute-2 sudo[142307]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:11.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:11 compute-2 sudo[142332]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:01:11 compute-2 sudo[142332]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:01:11 compute-2 sudo[142511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqlukwiupzckiwhzukzpewtbgjscjrhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583271.1578586-65-20755938536747/AnsiballZ_command.py'
Dec 01 10:01:11 compute-2 sudo[142511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:11 compute-2 sudo[142332]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:11 compute-2 python3.9[142517]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 10:01:11 compute-2 sudo[142511]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000029s ======
Dec 01 10:01:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:12.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Dec 01 10:01:12 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:01:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:13.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:13 compute-2 sudo[142681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbfdagosccqithxdrcynajodmlbvvtft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583272.7038054-98-77585503960038/AnsiballZ_systemd_service.py'
Dec 01 10:01:13 compute-2 sudo[142681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:13 compute-2 ceph-mon[76053]: pgmap v300: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 86 KiB/s rd, 0 B/s wr, 142 op/s
Dec 01 10:01:13 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:01:13 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:01:13 compute-2 ceph-mon[76053]: pgmap v301: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 290 B/s rd, 0 op/s
Dec 01 10:01:13 compute-2 python3.9[142683]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 01 10:01:13 compute-2 systemd[1]: Reloading.
Dec 01 10:01:13 compute-2 systemd-sysv-generator[142734]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:01:13 compute-2 systemd-rc-local-generator[142730]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:01:13 compute-2 podman[142686]: 2025-12-01 10:01:13.873837481 +0000 UTC m=+0.086100207 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 01 10:01:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:14 compute-2 sudo[142681]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:14 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:01:14 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:01:14 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:01:14 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:01:14 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:01:14 compute-2 ceph-mon[76053]: pgmap v302: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 290 B/s rd, 0 op/s
Dec 01 10:01:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:14.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100114 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:01:14 compute-2 python3.9[142894]: ansible-ansible.builtin.service_facts Invoked
Dec 01 10:01:14 compute-2 network[142911]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 01 10:01:14 compute-2 network[142912]: 'network-scripts' will be removed from distribution in near future.
Dec 01 10:01:14 compute-2 network[142913]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 01 10:01:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000029s ======
Dec 01 10:01:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:15.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Dec 01 10:01:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:16 compute-2 ceph-mon[76053]: pgmap v303: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 386 B/s rd, 0 op/s
Dec 01 10:01:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:16.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:17.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100117 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:01:17 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:01:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:18 compute-2 ceph-mon[76053]: pgmap v304: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 96 B/s rd, 0 op/s
Dec 01 10:01:18 compute-2 sudo[142983]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:01:18 compute-2 sudo[142983]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:01:18 compute-2 sudo[142983]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000029s ======
Dec 01 10:01:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:18.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Dec 01 10:01:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec 01 10:01:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:19.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec 01 10:01:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:20 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 7.
Dec 01 10:01:20 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 10:01:20 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 2.343s CPU time.
Dec 01 10:01:20 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec 01 10:01:20 compute-2 sudo[143182]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:01:20 compute-2 sudo[143182]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:01:20 compute-2 sudo[143182]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:20 compute-2 podman[143246]: 2025-12-01 10:01:20.330771532 +0000 UTC m=+0.046340925 container create 5c532f2adf14e8b609bb44a1da267a18d088c992f2b4fe28c408068169b56b5c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 10:01:20 compute-2 sudo[143285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpdwcvdfohbosrcoycoybeshtllmxkyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583280.0523615-155-184666966968453/AnsiballZ_systemd_service.py'
Dec 01 10:01:20 compute-2 sudo[143285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:20 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/59a395b95c8c621d75ee95601846eb3e6064b159d3bb015cd298ec7f9a303fdc/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 01 10:01:20 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/59a395b95c8c621d75ee95601846eb3e6064b159d3bb015cd298ec7f9a303fdc/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 10:01:20 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/59a395b95c8c621d75ee95601846eb3e6064b159d3bb015cd298ec7f9a303fdc/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 10:01:20 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/59a395b95c8c621d75ee95601846eb3e6064b159d3bb015cd298ec7f9a303fdc/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 10:01:20 compute-2 podman[143246]: 2025-12-01 10:01:20.397190148 +0000 UTC m=+0.112759561 container init 5c532f2adf14e8b609bb44a1da267a18d088c992f2b4fe28c408068169b56b5c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 10:01:20 compute-2 podman[143246]: 2025-12-01 10:01:20.40303215 +0000 UTC m=+0.118601543 container start 5c532f2adf14e8b609bb44a1da267a18d088c992f2b4fe28c408068169b56b5c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid)
Dec 01 10:01:20 compute-2 podman[143246]: 2025-12-01 10:01:20.309570808 +0000 UTC m=+0.025140221 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 10:01:20 compute-2 bash[143246]: 5c532f2adf14e8b609bb44a1da267a18d088c992f2b4fe28c408068169b56b5c
Dec 01 10:01:20 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 10:01:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:20 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 01 10:01:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:20 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 01 10:01:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:20 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 01 10:01:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:20 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 01 10:01:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:20 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 01 10:01:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:20 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 01 10:01:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:20 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 01 10:01:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:20 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:01:20 compute-2 python3.9[143287]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 10:01:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000029s ======
Dec 01 10:01:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:20.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Dec 01 10:01:20 compute-2 sudo[143285]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:20 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:01:20 compute-2 ceph-mon[76053]: pgmap v305: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 96 B/s rd, 0 op/s
Dec 01 10:01:20 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:01:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:21.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:21 compute-2 sudo[143484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ampdlwbzyexpgkofhatafpalggvndfzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583280.9647155-155-207900986586597/AnsiballZ_systemd_service.py'
Dec 01 10:01:21 compute-2 sudo[143484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:21 compute-2 python3.9[143487]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 10:01:21 compute-2 sudo[143484]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:22 compute-2 ceph-mon[76053]: pgmap v306: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 483 B/s rd, 96 B/s wr, 0 op/s
Dec 01 10:01:22 compute-2 sudo[143638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbecaxldyikhxktdekdjiudimytvdamw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583281.7499578-155-53557929039980/AnsiballZ_systemd_service.py'
Dec 01 10:01:22 compute-2 sudo[143638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:22 compute-2 python3.9[143640]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 10:01:22 compute-2 sudo[143638]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000029s ======
Dec 01 10:01:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:22.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Dec 01 10:01:22 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:01:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:23 compute-2 sudo[143791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqixqjbhimthuykixsvokiekocnqiqpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583282.7675545-155-88153795997066/AnsiballZ_systemd_service.py'
Dec 01 10:01:23 compute-2 sudo[143791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:23.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:23 compute-2 python3.9[143793]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 10:01:23 compute-2 sudo[143791]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:23 compute-2 sudo[143946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozxuwtrdvhwczmkmdhvhrgzxcemutomu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583283.5273414-155-201825444008941/AnsiballZ_systemd_service.py'
Dec 01 10:01:23 compute-2 sudo[143946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:24 compute-2 python3.9[143948]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 10:01:24 compute-2 sudo[143946]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:24 compute-2 ceph-mon[76053]: pgmap v307: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Dec 01 10:01:24 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:01:24 compute-2 sudo[144099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alkichoaxxjrrdsdtarpyuqbrgxydcfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583284.390801-155-112870320047740/AnsiballZ_systemd_service.py'
Dec 01 10:01:24 compute-2 sudo[144099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:24.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:25 compute-2 python3.9[144101]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 10:01:25 compute-2 sudo[144099]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000029s ======
Dec 01 10:01:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:25.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Dec 01 10:01:25 compute-2 sudo[144254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiagxetsatmlglzxdnthljhtjnjwjrll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583285.2002723-155-183126306320791/AnsiballZ_systemd_service.py'
Dec 01 10:01:25 compute-2 sudo[144254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:25 compute-2 python3.9[144256]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 10:01:25 compute-2 sudo[144254]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:26 compute-2 ceph-mon[76053]: pgmap v308: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 426 B/s wr, 1 op/s
Dec 01 10:01:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:26 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:01:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:26 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:01:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:26.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000029s ======
Dec 01 10:01:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:27.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Dec 01 10:01:27 compute-2 sudo[144409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddumadxluvqmqluzcvrknwqqqvuxknnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583287.119126-311-222070154371435/AnsiballZ_file.py'
Dec 01 10:01:27 compute-2 sudo[144409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:27 compute-2 python3.9[144411]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:01:27 compute-2 sudo[144409]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:27 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:01:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:28 compute-2 ceph-mon[76053]: pgmap v309: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 852 B/s rd, 426 B/s wr, 1 op/s
Dec 01 10:01:28 compute-2 sudo[144561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekzdvpylwxndwxgcrnoexxvtlyehphby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583287.8818886-311-21227449507047/AnsiballZ_file.py'
Dec 01 10:01:28 compute-2 sudo[144561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:28 compute-2 python3.9[144563]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:01:28 compute-2 sudo[144561]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000029s ======
Dec 01 10:01:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:28.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Dec 01 10:01:28 compute-2 sudo[144713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwaihxlmcxoywzkhqvqjwonvfwcktbwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583288.5092902-311-124578399184899/AnsiballZ_file.py'
Dec 01 10:01:28 compute-2 sudo[144713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:29 compute-2 python3.9[144715]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:01:29 compute-2 sudo[144713]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:29.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:29 compute-2 sudo[144867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pukiplxzrcmffweooncqpxtrpnabogkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583289.172524-311-58207836413405/AnsiballZ_file.py'
Dec 01 10:01:29 compute-2 sudo[144867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:29 compute-2 python3.9[144869]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:01:29 compute-2 sudo[144867]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:30 compute-2 ceph-mon[76053]: pgmap v310: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 852 B/s rd, 426 B/s wr, 1 op/s
Dec 01 10:01:30 compute-2 sudo[145019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwxuwcpgosrycawscsmcwjpcwylmywqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583289.8419673-311-209664063551794/AnsiballZ_file.py'
Dec 01 10:01:30 compute-2 sudo[145019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:30 compute-2 python3.9[145021]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:01:30 compute-2 sudo[145019]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:30.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:30 compute-2 sudo[145171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mccmbszjxztergulnbmgxeliohtigxce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583290.447846-311-217004921337038/AnsiballZ_file.py'
Dec 01 10:01:30 compute-2 sudo[145171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:31 compute-2 python3.9[145173]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:01:31 compute-2 sudo[145171]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec 01 10:01:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:31.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec 01 10:01:31 compute-2 sudo[145325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhvwowmtavoirmpsvqsvigkdxezzvgpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583291.22876-311-125817668977910/AnsiballZ_file.py'
Dec 01 10:01:31 compute-2 sudo[145325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:31 compute-2 python3.9[145327]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:01:31 compute-2 sudo[145325]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:32 compute-2 ceph-mon[76053]: pgmap v311: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 10:01:32 compute-2 sudo[145477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jixyovarjrxfmxwjncksyuopkbiblnfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583292.252431-460-3200429841682/AnsiballZ_file.py'
Dec 01 10:01:32 compute-2 sudo[145477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e34000df0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:01:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:32.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:32 compute-2 python3.9[145479]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:01:32 compute-2 sudo[145477]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:01:32 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:01:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec 01 10:01:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:33.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec 01 10:01:33 compute-2 sudo[145663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpiwwcfkgfswvdmzbiaacwexhxvpmxls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583292.8750913-460-153660519872330/AnsiballZ_file.py'
Dec 01 10:01:33 compute-2 sudo[145663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:33 compute-2 podman[145619]: 2025-12-01 10:01:33.218943388 +0000 UTC m=+0.080941724 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 01 10:01:33 compute-2 python3.9[145667]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:01:33 compute-2 sudo[145663]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:33 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e1c0016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:01:33 compute-2 sudo[145818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgxktjsxldoxrebodfaoxgynntcicawi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583293.56774-460-53407377177928/AnsiballZ_file.py'
Dec 01 10:01:33 compute-2 sudo[145818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:34 compute-2 python3.9[145820]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:01:34 compute-2 sudo[145818]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:34 compute-2 ceph-mon[76053]: pgmap v312: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Dec 01 10:01:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:34 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e0c000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:01:34 compute-2 sudo[145970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ducvbjimrjetgtvnwdqjstbwdkzznnop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583294.189059-460-204690693899471/AnsiballZ_file.py'
Dec 01 10:01:34 compute-2 sudo[145970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:34 compute-2 python3.9[145972]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:01:34 compute-2 sudo[145970]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100134 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:01:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:34 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e04000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:01:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:34.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:35 compute-2 sudo[146122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liaankndyemrhejdhefielorcjsjvfvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583294.8249707-460-99168892310272/AnsiballZ_file.py'
Dec 01 10:01:35 compute-2 sudo[146122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec 01 10:01:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:35.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec 01 10:01:35 compute-2 python3.9[146124]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:01:35 compute-2 sudo[146122]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:35 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e1c0016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:01:35 compute-2 sudo[146276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgltmqyigxpalxzjmngifcxsaggcjltm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583295.4837463-460-253042921353713/AnsiballZ_file.py'
Dec 01 10:01:35 compute-2 sudo[146276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:35 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:01:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:35 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:01:35 compute-2 python3.9[146278]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:01:35 compute-2 sudo[146276]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:36 compute-2 ceph-mon[76053]: pgmap v313: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.9 KiB/s rd, 1.4 KiB/s wr, 4 op/s
Dec 01 10:01:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:36 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e1c0016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:01:36 compute-2 sudo[146428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbainuyhhchuewgzxdqsayjeohzkmqao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583296.182932-460-233337465228611/AnsiballZ_file.py'
Dec 01 10:01:36 compute-2 sudo[146428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:36 compute-2 python3.9[146430]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:01:36 compute-2 sudo[146428]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:36 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e0c0016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:01:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:36.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:36 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 10:01:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000029s ======
Dec 01 10:01:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:37.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Dec 01 10:01:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:37 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e040016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:01:37 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:01:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:38 compute-2 ceph-mon[76053]: pgmap v314: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1.1 KiB/s wr, 3 op/s
Dec 01 10:01:38 compute-2 sudo[146582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orusegifmvdwajspdroyjblkjxttuzxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583297.813723-614-198529354177602/AnsiballZ_command.py'
Dec 01 10:01:38 compute-2 sudo[146582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:38 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e30001ac0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:01:38 compute-2 python3.9[146584]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 10:01:38 compute-2 sudo[146582]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:38 compute-2 sudo[146611]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:01:38 compute-2 sudo[146611]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:01:38 compute-2 sudo[146611]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:38 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e1c0016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:01:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:38.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:39.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:39 compute-2 python3.9[146761]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 01 10:01:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:39 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e0c0016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:01:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100139 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:01:39 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:01:39 compute-2 sudo[146913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lawnzxfreqzkyddvylzmpdbhximxpefg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583299.642417-668-2864700893244/AnsiballZ_systemd_service.py'
Dec 01 10:01:39 compute-2 sudo[146913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:40 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e040016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:01:40 compute-2 python3.9[146915]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 01 10:01:40 compute-2 systemd[1]: Reloading.
Dec 01 10:01:40 compute-2 systemd-rc-local-generator[146940]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:01:40 compute-2 systemd-sysv-generator[146943]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:01:40 compute-2 ceph-mon[76053]: pgmap v315: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1.1 KiB/s wr, 3 op/s
Dec 01 10:01:40 compute-2 sudo[146913]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:40 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e300025c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:01:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:40.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:41.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:41 compute-2 sudo[147102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdjhaowqayzyfdplyyyyyciwxdkexdbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583300.8561573-692-124978992789424/AnsiballZ_command.py'
Dec 01 10:01:41 compute-2 sudo[147102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:41 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e1c0016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:01:41 compute-2 python3.9[147104]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 10:01:41 compute-2 sudo[147102]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:41 compute-2 sudo[147255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewnbdaswsameeocdoybvdecynltzptqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583301.7248375-692-230678673825733/AnsiballZ_command.py'
Dec 01 10:01:41 compute-2 sudo[147255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:42 compute-2 ceph-mon[76053]: pgmap v316: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.3 KiB/s rd, 1.3 KiB/s wr, 4 op/s
Dec 01 10:01:42 compute-2 python3.9[147257]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 10:01:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:42 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e0c0016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:01:42 compute-2 sudo[147255]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:42 compute-2 sudo[147408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txbvidbpqwifmhyrdqsmqmroiohottlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583302.3618271-692-79304095675437/AnsiballZ_command.py'
Dec 01 10:01:42 compute-2 sudo[147408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:42 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e040016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:01:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:01:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:42.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:01:42 compute-2 python3.9[147410]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 10:01:42 compute-2 sudo[147408]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:42 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:01:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:01:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:43.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:01:43 compute-2 sudo[147562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yflgilbyzwuksmnbyyktwwzbqmrkhwei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583303.0097268-692-208756908399532/AnsiballZ_command.py'
Dec 01 10:01:43 compute-2 sudo[147562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:43 compute-2 python3.9[147565]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 10:01:43 compute-2 sudo[147562]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:43 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e300025c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:01:43 compute-2 sudo[147716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqprdaxlwmtvriqquitjxsbrsrsivzzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583303.6446643-692-71589851726363/AnsiballZ_command.py'
Dec 01 10:01:43 compute-2 sudo[147716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:44 compute-2 ceph-mon[76053]: pgmap v317: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 767 B/s wr, 3 op/s
Dec 01 10:01:44 compute-2 python3.9[147718]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 10:01:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:44 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e040016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:01:44 compute-2 sudo[147716]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:44 compute-2 podman[147720]: 2025-12-01 10:01:44.282451064 +0000 UTC m=+0.101522926 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 01 10:01:44 compute-2 sudo[147895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avgqkbwegjdgslnnwdagqtqxakdthgad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583304.3388937-692-219001667747631/AnsiballZ_command.py'
Dec 01 10:01:44 compute-2 sudo[147895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:44 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e0c002b10 fd 42 proxy ignored for local
Dec 01 10:01:44 compute-2 kernel: ganesha.nfsd[145495]: segfault at 50 ip 00007f6ee085e32e sp 00007f6e997f9210 error 4 in libntirpc.so.5.8[7f6ee0843000+2c000] likely on CPU 0 (core 0, socket 0)
Dec 01 10:01:44 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec 01 10:01:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:44.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:44 compute-2 systemd[1]: Started Process Core Dump (PID 147898/UID 0).
Dec 01 10:01:44 compute-2 python3.9[147897]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 10:01:44 compute-2 sudo[147895]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:45.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:45 compute-2 sudo[148051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grjrrhagmrkbtreidzlaqwwngbiqlpas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583305.00361-692-148773742866866/AnsiballZ_command.py'
Dec 01 10:01:45 compute-2 sudo[148051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:45 compute-2 python3.9[148054]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 10:01:45 compute-2 sudo[148051]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:46 compute-2 ceph-mon[76053]: pgmap v318: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 767 B/s wr, 3 op/s
Dec 01 10:01:46 compute-2 systemd-coredump[147899]: Process 143295 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 55:
                                                    #0  0x00007f6ee085e32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Dec 01 10:01:46 compute-2 systemd[1]: systemd-coredump@7-147898-0.service: Deactivated successfully.
Dec 01 10:01:46 compute-2 systemd[1]: systemd-coredump@7-147898-0.service: Consumed 1.666s CPU time.
Dec 01 10:01:46 compute-2 podman[148084]: 2025-12-01 10:01:46.533056297 +0000 UTC m=+0.029902198 container died 5c532f2adf14e8b609bb44a1da267a18d088c992f2b4fe28c408068169b56b5c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec 01 10:01:46 compute-2 systemd[1]: var-lib-containers-storage-overlay-59a395b95c8c621d75ee95601846eb3e6064b159d3bb015cd298ec7f9a303fdc-merged.mount: Deactivated successfully.
Dec 01 10:01:46 compute-2 podman[148084]: 2025-12-01 10:01:46.592329609 +0000 UTC m=+0.089175490 container remove 5c532f2adf14e8b609bb44a1da267a18d088c992f2b4fe28c408068169b56b5c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Dec 01 10:01:46 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec 01 10:01:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:01:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:46.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:01:46 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec 01 10:01:46 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.935s CPU time.
Dec 01 10:01:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:01:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:47.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:01:47 compute-2 sudo[148254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-togwrhhxbmnlqbmxwibxpnaxqonmdqoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583306.8465464-853-275524021380419/AnsiballZ_getent.py'
Dec 01 10:01:47 compute-2 sudo[148254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:47 compute-2 python3.9[148256]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Dec 01 10:01:47 compute-2 sudo[148254]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:47 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:01:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:48 compute-2 ceph-mon[76053]: pgmap v319: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 255 B/s wr, 1 op/s
Dec 01 10:01:48 compute-2 sudo[148407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjkzwgarxqrawulupixuvbykzudrtdfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583307.8593087-878-202824444029996/AnsiballZ_group.py'
Dec 01 10:01:48 compute-2 sudo[148407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:48 compute-2 python3.9[148409]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 01 10:01:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:48.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:49 compute-2 groupadd[148410]: group added to /etc/group: name=libvirt, GID=42473
Dec 01 10:01:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:49.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:49 compute-2 groupadd[148410]: group added to /etc/gshadow: name=libvirt
Dec 01 10:01:49 compute-2 groupadd[148410]: new group: name=libvirt, GID=42473
Dec 01 10:01:49 compute-2 sudo[148407]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:50 compute-2 ceph-mon[76053]: pgmap v320: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 255 B/s wr, 1 op/s
Dec 01 10:01:50 compute-2 sudo[148567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqwatdopbuwmztrectmrkegycmmqnofs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583309.6667092-901-164174677204081/AnsiballZ_user.py'
Dec 01 10:01:50 compute-2 sudo[148567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:50 compute-2 python3.9[148569]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 01 10:01:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100150 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:01:50 compute-2 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 01 10:01:50 compute-2 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 01 10:01:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:50.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:01:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:51.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:01:51 compute-2 useradd[148571]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Dec 01 10:01:51 compute-2 sudo[148567]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:52 compute-2 ceph-mon[76053]: pgmap v321: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 255 B/s wr, 1 op/s
Dec 01 10:01:52 compute-2 sudo[148730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqfbewelvpsbxeuildeebftxxakobgjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583312.1938603-934-184226566030866/AnsiballZ_setup.py'
Dec 01 10:01:52 compute-2 sudo[148730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:52.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:52 compute-2 python3.9[148732]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 10:01:52 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:01:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:53 compute-2 sudo[148730]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:53.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:53 compute-2 sudo[148816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oohwwihtjzmednopglecjbxgfkfplnds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583312.1938603-934-184226566030866/AnsiballZ_dnf.py'
Dec 01 10:01:53 compute-2 sudo[148816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:01:53 compute-2 python3.9[148818]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 10:01:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:54 compute-2 ceph-mon[76053]: pgmap v322: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:01:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:01:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:54.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:01:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:01:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:01:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:55.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:01:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:56 compute-2 ceph-mon[76053]: pgmap v323: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:01:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:01:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:56.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:01:56 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 8.
Dec 01 10:01:56 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 10:01:56 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.935s CPU time.
Dec 01 10:01:56 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec 01 10:01:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:57 compute-2 podman[148875]: 2025-12-01 10:01:57.077219173 +0000 UTC m=+0.052807614 container create 9cd4493e23831e8699f350c343bcfd47c06061f9e6dd575a1513fbe226b4c6eb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 01 10:01:57 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da179a0438ce5d2a9a477e541aa2bf2829f4fcd93ffc4d59c9b11667d3af449b/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 01 10:01:57 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da179a0438ce5d2a9a477e541aa2bf2829f4fcd93ffc4d59c9b11667d3af449b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 10:01:57 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da179a0438ce5d2a9a477e541aa2bf2829f4fcd93ffc4d59c9b11667d3af449b/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 10:01:57 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da179a0438ce5d2a9a477e541aa2bf2829f4fcd93ffc4d59c9b11667d3af449b/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 10:01:57 compute-2 podman[148875]: 2025-12-01 10:01:57.149354721 +0000 UTC m=+0.124943192 container init 9cd4493e23831e8699f350c343bcfd47c06061f9e6dd575a1513fbe226b4c6eb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0)
Dec 01 10:01:57 compute-2 podman[148875]: 2025-12-01 10:01:57.05440199 +0000 UTC m=+0.029990461 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 10:01:57 compute-2 podman[148875]: 2025-12-01 10:01:57.156482188 +0000 UTC m=+0.132070639 container start 9cd4493e23831e8699f350c343bcfd47c06061f9e6dd575a1513fbe226b4c6eb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 10:01:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:01:57 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 01 10:01:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:01:57 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 01 10:01:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:01:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:57.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:01:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:01:57 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 01 10:01:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:01:57 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 01 10:01:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:01:57 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 01 10:01:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:01:57 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 01 10:01:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:01:57 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 01 10:01:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:01:57 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:01:57 compute-2 bash[148875]: 9cd4493e23831e8699f350c343bcfd47c06061f9e6dd575a1513fbe226b4c6eb
Dec 01 10:01:57 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 10:01:57 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:01:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:58 compute-2 sudo[148935]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:01:58 compute-2 sudo[148935]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:01:58 compute-2 sudo[148935]: pam_unix(sudo:session): session closed for user root
Dec 01 10:01:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:01:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:58.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:01:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:01:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:01:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:01:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:59.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:01:59 compute-2 ceph-mon[76053]: pgmap v324: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:01:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:00 compute-2 ceph-mon[76053]: pgmap v325: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:02:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:00.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:01.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:02.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:03 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:02:03 compute-2 ceph-mon[76053]: pgmap v326: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 511 B/s wr, 1 op/s
Dec 01 10:02:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:03.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:03 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:02:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:03 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:02:03 compute-2 podman[148966]: 2025-12-01 10:02:03.409225909 +0000 UTC m=+0.063791494 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Dec 01 10:02:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:04 compute-2 ceph-mon[76053]: pgmap v327: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 511 B/s wr, 1 op/s
Dec 01 10:02:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:02:04.682 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:02:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:02:04.684 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:02:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:02:04.684 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:02:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:04.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:02:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:05.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:02:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:02:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:06.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:02:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:07 compute-2 ceph-mon[76053]: pgmap v328: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 938 B/s wr, 3 op/s
Dec 01 10:02:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:02:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:07.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:02:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:08 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:02:08 compute-2 ceph-mon[76053]: pgmap v329: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 938 B/s wr, 3 op/s
Dec 01 10:02:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:08.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:09.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 10:02:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec 01 10:02:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec 01 10:02:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec 01 10:02:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec 01 10:02:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec 01 10:02:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec 01 10:02:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:02:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:02:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:02:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec 01 10:02:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:02:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec 01 10:02:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec 01 10:02:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec 01 10:02:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec 01 10:02:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec 01 10:02:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec 01 10:02:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec 01 10:02:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec 01 10:02:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec 01 10:02:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec 01 10:02:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec 01 10:02:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec 01 10:02:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 10:02:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec 01 10:02:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 10:02:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78ec000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:09 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:02:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:10 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78e00014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:10 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c8000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:10.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:10 compute-2 ceph-mon[76053]: pgmap v330: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 938 B/s wr, 3 op/s
Dec 01 10:02:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:02:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:11.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:02:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:11 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c4000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:12 compute-2 ceph-mon[76053]: pgmap v331: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.6 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 10:02:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:12 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78d0000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100212 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:02:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:12 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78e00021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:02:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:12.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:02:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:13 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:02:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:02:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:13.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:02:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:13 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:14 compute-2 ceph-mon[76053]: pgmap v332: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 511 B/s wr, 2 op/s
Dec 01 10:02:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:14 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:14 compute-2 podman[149184]: 2025-12-01 10:02:14.444782582 +0000 UTC m=+0.099774699 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 10:02:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:14 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:02:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:14.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:02:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:02:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:15.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:02:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:15 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78e00021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:16 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78d0001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:16 compute-2 ceph-mon[76053]: pgmap v333: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 511 B/s wr, 2 op/s
Dec 01 10:02:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:16 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:16.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:17.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:17 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:18 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:02:18 compute-2 ceph-mon[76053]: pgmap v334: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Dec 01 10:02:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:18 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78e00021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:18 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78d00023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:02:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:18.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:02:18 compute-2 sudo[149214]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:02:18 compute-2 sudo[149214]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:02:18 compute-2 sudo[149214]: pam_unix(sudo:session): session closed for user root
Dec 01 10:02:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:19.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:19 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:20 compute-2 ceph-mon[76053]: pgmap v335: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Dec 01 10:02:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:20 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:20 compute-2 sudo[149243]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:02:20 compute-2 sudo[149243]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:02:20 compute-2 sudo[149243]: pam_unix(sudo:session): session closed for user root
Dec 01 10:02:20 compute-2 sudo[149268]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:02:20 compute-2 sudo[149268]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:02:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:20 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78e00021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:02:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:20.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:02:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:21 compute-2 sudo[149268]: pam_unix(sudo:session): session closed for user root
Dec 01 10:02:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:21.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:21 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:02:21 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:02:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:21 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78d00023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:22 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:22 compute-2 ceph-mon[76053]: pgmap v336: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 552 B/s rd, 92 B/s wr, 0 op/s
Dec 01 10:02:22 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:02:22 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:02:22 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:02:22 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:02:22 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:02:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:22 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:02:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:22.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:02:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:23 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:02:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:23.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:23 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78e00021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:24 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78d00023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:24 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:24.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:24 compute-2 ceph-mon[76053]: pgmap v337: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 276 B/s rd, 0 op/s
Dec 01 10:02:24 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:02:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:02:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:25.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:02:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:25 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:26 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78e00021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:26 compute-2 ceph-mon[76053]: pgmap v338: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 276 B/s rd, 0 op/s
Dec 01 10:02:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:26 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78d0003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:26.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:27.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:27 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:27 compute-2 sudo[149335]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:02:27 compute-2 sudo[149335]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:02:27 compute-2 sudo[149335]: pam_unix(sudo:session): session closed for user root
Dec 01 10:02:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:28 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:02:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:28 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:28 compute-2 ceph-mon[76053]: pgmap v339: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 276 B/s rd, 0 op/s
Dec 01 10:02:28 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:02:28 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:02:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:28 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78e00021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:02:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:28.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:02:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:02:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:29.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:02:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:29 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78d0003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:30 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c8003110 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:30 compute-2 ceph-mon[76053]: pgmap v340: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 276 B/s rd, 0 op/s
Dec 01 10:02:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:30 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c4003700 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:30.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:31.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:31 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78e00021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:32 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78d0003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:32 compute-2 kernel: SELinux:  Converting 2773 SID table entries...
Dec 01 10:02:32 compute-2 kernel: SELinux:  policy capability network_peer_controls=1
Dec 01 10:02:32 compute-2 kernel: SELinux:  policy capability open_perms=1
Dec 01 10:02:32 compute-2 kernel: SELinux:  policy capability extended_socket_class=1
Dec 01 10:02:32 compute-2 kernel: SELinux:  policy capability always_check_network=0
Dec 01 10:02:32 compute-2 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 01 10:02:32 compute-2 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 01 10:02:32 compute-2 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 01 10:02:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:32 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c8003a30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:02:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:32.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:02:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:33 compute-2 ceph-mon[76053]: pgmap v341: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 552 B/s rd, 0 op/s
Dec 01 10:02:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:33 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:02:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:33.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:33 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c4003700 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:34 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78e00021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:34 compute-2 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Dec 01 10:02:34 compute-2 ceph-mon[76053]: pgmap v342: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:02:34 compute-2 podman[149374]: 2025-12-01 10:02:34.402910832 +0000 UTC m=+0.052794363 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 01 10:02:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:34 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78d0003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:02:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:34.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:02:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:02:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:35.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:02:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:35 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c8003a30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:36 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c4003700 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:02:36 compute-2 ceph-mon[76053]: pgmap v343: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:02:36 compute-2 kernel: ganesha.nfsd[149017]: segfault at 50 ip 00007f799cb3632e sp 00007f796affc210 error 4 in libntirpc.so.5.8[7f799cb1b000+2c000] likely on CPU 6 (core 0, socket 6)
Dec 01 10:02:36 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec 01 10:02:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:36 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78e00021d0 fd 38 proxy ignored for local
Dec 01 10:02:36 compute-2 systemd[1]: Started Process Core Dump (PID 149394/UID 0).
Dec 01 10:02:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:02:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:36.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:02:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:37.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:38 compute-2 systemd-coredump[149395]: Process 148896 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 43:
                                                    #0  0x00007f799cb3632e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Dec 01 10:02:38 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:02:38 compute-2 systemd[1]: systemd-coredump@8-149394-0.service: Deactivated successfully.
Dec 01 10:02:38 compute-2 systemd[1]: systemd-coredump@8-149394-0.service: Consumed 1.303s CPU time.
Dec 01 10:02:38 compute-2 podman[149402]: 2025-12-01 10:02:38.211114154 +0000 UTC m=+0.028330097 container died 9cd4493e23831e8699f350c343bcfd47c06061f9e6dd575a1513fbe226b4c6eb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Dec 01 10:02:38 compute-2 systemd[1]: var-lib-containers-storage-overlay-da179a0438ce5d2a9a477e541aa2bf2829f4fcd93ffc4d59c9b11667d3af449b-merged.mount: Deactivated successfully.
Dec 01 10:02:38 compute-2 podman[149402]: 2025-12-01 10:02:38.269485381 +0000 UTC m=+0.086701294 container remove 9cd4493e23831e8699f350c343bcfd47c06061f9e6dd575a1513fbe226b4c6eb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 10:02:38 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec 01 10:02:38 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec 01 10:02:38 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.582s CPU time.
Dec 01 10:02:38 compute-2 ceph-mon[76053]: pgmap v344: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:02:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:38.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:38 compute-2 sudo[149445]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:02:38 compute-2 sudo[149445]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:02:38 compute-2 sudo[149445]: pam_unix(sudo:session): session closed for user root
Dec 01 10:02:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:39.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:39 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:02:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:02:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:40.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:02:40 compute-2 ceph-mon[76053]: pgmap v345: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:02:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:41.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100242 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:02:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:02:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:42.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:02:42 compute-2 ceph-mon[76053]: pgmap v346: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 01 10:02:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:43 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:02:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:43.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:43 compute-2 kernel: SELinux:  Converting 2773 SID table entries...
Dec 01 10:02:43 compute-2 kernel: SELinux:  policy capability network_peer_controls=1
Dec 01 10:02:43 compute-2 kernel: SELinux:  policy capability open_perms=1
Dec 01 10:02:43 compute-2 kernel: SELinux:  policy capability extended_socket_class=1
Dec 01 10:02:43 compute-2 kernel: SELinux:  policy capability always_check_network=0
Dec 01 10:02:43 compute-2 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 01 10:02:43 compute-2 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 01 10:02:43 compute-2 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 01 10:02:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:44 compute-2 ceph-mon[76053]: pgmap v347: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:02:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:02:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:44.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:02:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:02:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:45.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:02:45 compute-2 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Dec 01 10:02:45 compute-2 podman[149485]: 2025-12-01 10:02:45.463474635 +0000 UTC m=+0.109457187 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec 01 10:02:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:46 compute-2 ceph-mon[76053]: pgmap v348: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:02:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:02:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:46.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:02:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:02:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:47.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:02:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:48 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:02:48 compute-2 ceph-mon[76053]: pgmap v349: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:02:48 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 9.
Dec 01 10:02:48 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 10:02:48 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.582s CPU time.
Dec 01 10:02:48 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec 01 10:02:48 compute-2 podman[149560]: 2025-12-01 10:02:48.80843538 +0000 UTC m=+0.047241152 container create 92cf32fe993836a877a53878e8b3e4e4f51b688e8f6e4d07c4beb99d3845abeb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 10:02:48 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec1fe695f097301667b7a96d6f60fca9d09e7845a3a18294464492d85f3e3cad/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 01 10:02:48 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec1fe695f097301667b7a96d6f60fca9d09e7845a3a18294464492d85f3e3cad/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 10:02:48 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec1fe695f097301667b7a96d6f60fca9d09e7845a3a18294464492d85f3e3cad/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 10:02:48 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec1fe695f097301667b7a96d6f60fca9d09e7845a3a18294464492d85f3e3cad/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 10:02:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:48.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:48 compute-2 podman[149560]: 2025-12-01 10:02:48.874930115 +0000 UTC m=+0.113735907 container init 92cf32fe993836a877a53878e8b3e4e4f51b688e8f6e4d07c4beb99d3845abeb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec 01 10:02:48 compute-2 podman[149560]: 2025-12-01 10:02:48.880110678 +0000 UTC m=+0.118916450 container start 92cf32fe993836a877a53878e8b3e4e4f51b688e8f6e4d07c4beb99d3845abeb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec 01 10:02:48 compute-2 podman[149560]: 2025-12-01 10:02:48.785732258 +0000 UTC m=+0.024538060 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 10:02:48 compute-2 bash[149560]: 92cf32fe993836a877a53878e8b3e4e4f51b688e8f6e4d07c4beb99d3845abeb
Dec 01 10:02:48 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 10:02:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:02:48 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 01 10:02:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:02:48 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 01 10:02:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:02:48 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 01 10:02:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:02:48 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 01 10:02:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:02:48 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 01 10:02:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:02:48 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 01 10:02:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:02:48 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 01 10:02:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:02:49 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:02:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:49.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:50 compute-2 ceph-mon[76053]: pgmap v350: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:02:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:50.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:51.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:52 compute-2 ceph-mon[76053]: pgmap v351: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 85 B/s wr, 0 op/s
Dec 01 10:02:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:52.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:53 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:02:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:02:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:53.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:02:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:54 compute-2 ceph-mon[76053]: pgmap v352: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Dec 01 10:02:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:54.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:02:55 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:02:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:02:55 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:02:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:55.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:02:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:02:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:02:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:56.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:02:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:02:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:57.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:02:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:02:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:58.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:02:58 compute-2 sudo[149625]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:02:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:59 compute-2 sudo[149625]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:02:59 compute-2 sudo[149625]: pam_unix(sudo:session): session closed for user root
Dec 01 10:02:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:02:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:02:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:02:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:59.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:00 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:03:00 compute-2 ceph-mon[76053]: pgmap v353: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Dec 01 10:03:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100300 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:03:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:00.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 10:03:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec 01 10:03:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec 01 10:03:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec 01 10:03:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec 01 10:03:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec 01 10:03:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec 01 10:03:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:03:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:03:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:03:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec 01 10:03:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:03:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec 01 10:03:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec 01 10:03:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec 01 10:03:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec 01 10:03:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec 01 10:03:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec 01 10:03:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec 01 10:03:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec 01 10:03:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec 01 10:03:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec 01 10:03:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec 01 10:03:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec 01 10:03:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 10:03:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec 01 10:03:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 10:03:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:01.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:01 compute-2 ceph-mon[76053]: pgmap v354: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 853 B/s wr, 2 op/s
Dec 01 10:03:01 compute-2 ceph-mon[76053]: pgmap v355: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 852 B/s wr, 2 op/s
Dec 01 10:03:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:02 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f0001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:02 compute-2 ceph-mon[76053]: pgmap v356: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 3 op/s
Dec 01 10:03:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:02 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:03:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:02.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:03:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:03:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:03.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:03:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:03 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:04 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:04 compute-2 ceph-mon[76053]: pgmap v357: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 852 B/s wr, 2 op/s
Dec 01 10:03:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:03:04.684 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:03:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:03:04.686 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:03:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:03:04.686 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:03:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100304 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:03:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:04 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f0001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:03:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:04.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:03:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:05 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:03:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:05.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:05 compute-2 podman[152840]: 2025-12-01 10:03:05.408340116 +0000 UTC m=+0.057363022 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, 
container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 01 10:03:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:05 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f8001c40 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:06 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc0023f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:06 compute-2 ceph-mon[76053]: pgmap v358: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 852 B/s wr, 2 op/s
Dec 01 10:03:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:06 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc0023f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:06.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:07.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:07 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f0001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:08 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f8002740 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:08 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc0023f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:08 compute-2 ceph-mon[76053]: pgmap v359: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 852 B/s wr, 2 op/s
Dec 01 10:03:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:08.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:09 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:03:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:09.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:09 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8001840 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:09 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:03:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:10 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:03:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:10 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f0001c00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:10 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f8002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:10 compute-2 ceph-mon[76053]: pgmap v360: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Dec 01 10:03:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:10.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:03:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:11.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:03:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:11 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc0091b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:12 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:03:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:12 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:03:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:12 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8001840 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:12 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8001840 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:12 compute-2 ceph-mon[76053]: pgmap v361: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 682 B/s wr, 2 op/s
Dec 01 10:03:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:03:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:12.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:03:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:13.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:13 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f8003450 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:14 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc0091b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:14 compute-2 ceph-mon[76053]: pgmap v362: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Dec 01 10:03:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:14 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f0001c00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:03:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:14.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:03:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:15 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 10:03:15 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:03:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:15.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:15 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f8003450 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:16 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f8003450 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:16 compute-2 podman[159955]: 2025-12-01 10:03:16.432867894 +0000 UTC m=+0.089116796 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 01 10:03:16 compute-2 ceph-mon[76053]: pgmap v363: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Dec 01 10:03:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:16 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009ad0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:03:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:16.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:03:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:03:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:17.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:03:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:17 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f00034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:18 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f8003450 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:18 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8002f00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:18 compute-2 ceph-mon[76053]: pgmap v364: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 10:03:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:03:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:18.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:03:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:19 compute-2 sudo[161681]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:03:19 compute-2 sudo[161681]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:03:19 compute-2 sudo[161681]: pam_unix(sudo:session): session closed for user root
Dec 01 10:03:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:03:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:19.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:03:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:19 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009ad0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:20 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:03:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:20 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f00034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100320 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:03:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:20 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f8003450 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:03:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:20.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:03:20 compute-2 ceph-mon[76053]: pgmap v365: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 10:03:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:21.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:21 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f8003450 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:22 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009ad0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:22 compute-2 ceph-mon[76053]: pgmap v366: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 10:03:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:22 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f00034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:22.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:23.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:23 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:24 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:24 compute-2 ceph-mon[76053]: pgmap v367: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Dec 01 10:03:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:24 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009ad0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:24.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:25 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:03:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:25.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:25 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f00034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:03:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:26 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:26 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:26.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:27.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:27 compute-2 ceph-mon[76053]: pgmap v368: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Dec 01 10:03:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:27 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:27 compute-2 sudo[166548]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:03:27 compute-2 sudo[166548]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:03:27 compute-2 sudo[166548]: pam_unix(sudo:session): session closed for user root
Dec 01 10:03:27 compute-2 sudo[166573]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Dec 01 10:03:27 compute-2 sudo[166573]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:03:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:28 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:28 compute-2 ceph-mon[76053]: pgmap v369: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 426 B/s wr, 2 op/s
Dec 01 10:03:28 compute-2 podman[166672]: 2025-12-01 10:03:28.575308515 +0000 UTC m=+0.073198875 container exec 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 10:03:28 compute-2 podman[166672]: 2025-12-01 10:03:28.691351645 +0000 UTC m=+0.189241975 container exec_died 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 10:03:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:28 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:28.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:29 compute-2 podman[166789]: 2025-12-01 10:03:29.210767014 +0000 UTC m=+0.067084414 container exec f15aa877aa74ea87a56ed7c269b1149a449e9df36b30b3012f6ca84c92ef9519 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 01 10:03:29 compute-2 podman[166789]: 2025-12-01 10:03:29.21870237 +0000 UTC m=+0.075019790 container exec_died f15aa877aa74ea87a56ed7c269b1149a449e9df36b30b3012f6ca84c92ef9519 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 01 10:03:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:29.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:29 compute-2 podman[166882]: 2025-12-01 10:03:29.578781643 +0000 UTC m=+0.059340604 container exec 92cf32fe993836a877a53878e8b3e4e4f51b688e8f6e4d07c4beb99d3845abeb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325)
Dec 01 10:03:29 compute-2 podman[166882]: 2025-12-01 10:03:29.593995998 +0000 UTC m=+0.074554929 container exec_died 92cf32fe993836a877a53878e8b3e4e4f51b688e8f6e4d07c4beb99d3845abeb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.build-date=20250325, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 01 10:03:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:29 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009ad0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:29 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec 01 10:03:29 compute-2 podman[166945]: 2025-12-01 10:03:29.81807752 +0000 UTC m=+0.059640531 container exec 5c83b48e4050b43a9798275fb3960cbbe72d63867925b25374b9306e495d3a3c (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt)
Dec 01 10:03:29 compute-2 podman[166945]: 2025-12-01 10:03:29.846648254 +0000 UTC m=+0.088211255 container exec_died 5c83b48e4050b43a9798275fb3960cbbe72d63867925b25374b9306e495d3a3c (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt)
Dec 01 10:03:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:30 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:03:30 compute-2 podman[167013]: 2025-12-01 10:03:30.235433495 +0000 UTC m=+0.072772614 container exec a0ef4b89ca18a14932dc44e4b2a521390b6fd555df2aaa06cada89bf83a23464 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv, io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20, release=1793, distribution-scope=public, vendor=Red Hat, Inc., summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, version=2.2.4, vcs-type=git)
Dec 01 10:03:30 compute-2 podman[167033]: 2025-12-01 10:03:30.298788076 +0000 UTC m=+0.051326976 container exec_died a0ef4b89ca18a14932dc44e4b2a521390b6fd555df2aaa06cada89bf83a23464 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv, distribution-scope=public, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, release=1793, io.openshift.tags=Ceph keepalived, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, description=keepalived for Ceph, com.redhat.component=keepalived-container)
Dec 01 10:03:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:30 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:30 compute-2 podman[167013]: 2025-12-01 10:03:30.308472705 +0000 UTC m=+0.145811814 container exec_died a0ef4b89ca18a14932dc44e4b2a521390b6fd555df2aaa06cada89bf83a23464 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, architecture=x86_64, version=2.2.4, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, io.buildah.version=1.28.2, description=keepalived for Ceph, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Dec 01 10:03:30 compute-2 sudo[166573]: pam_unix(sudo:session): session closed for user root
Dec 01 10:03:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:30 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:30 compute-2 ceph-mon[76053]: pgmap v370: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Dec 01 10:03:30 compute-2 sudo[167088]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:03:30 compute-2 sudo[167088]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:03:30 compute-2 sudo[167088]: pam_unix(sudo:session): session closed for user root
Dec 01 10:03:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:30.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:30 compute-2 sudo[167113]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:03:30 compute-2 sudo[167113]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:03:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:03:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:31.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:03:31 compute-2 sudo[167113]: pam_unix(sudo:session): session closed for user root
Dec 01 10:03:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:31 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:31 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:03:31 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:03:31 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:03:31 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:03:31 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Dec 01 10:03:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:32 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009ad0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:32 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:32 compute-2 ceph-mon[76053]: pgmap v371: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Dec 01 10:03:32 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Dec 01 10:03:32 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:03:32 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:03:32 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:03:32 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:03:32 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:03:32 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:03:32 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:03:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:03:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:32.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:03:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:33.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:33 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:34 compute-2 ceph-mon[76053]: pgmap v372: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 276 B/s rd, 0 op/s
Dec 01 10:03:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:34 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:34 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009ad0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:03:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:34.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:03:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:35 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:03:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:03:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:35.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:03:35 compute-2 ceph-mon[76053]: pgmap v373: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 276 B/s rd, 0 op/s
Dec 01 10:03:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:35 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:36 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:36 compute-2 podman[167176]: 2025-12-01 10:03:36.43667411 +0000 UTC m=+0.078831064 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 01 10:03:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:36 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:36.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:37.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:37 compute-2 sudo[167198]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:03:37 compute-2 sudo[167198]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:03:37 compute-2 sudo[167198]: pam_unix(sudo:session): session closed for user root
Dec 01 10:03:37 compute-2 ceph-mon[76053]: pgmap v374: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 461 B/s rd, 0 op/s
Dec 01 10:03:37 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:03:37 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:03:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:37 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009ad0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:38 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:38 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:38.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:39 compute-2 sudo[167224]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:03:39 compute-2 sudo[167224]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:03:39 compute-2 sudo[167224]: pam_unix(sudo:session): session closed for user root
Dec 01 10:03:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:03:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:39.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:03:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:39 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:39 compute-2 ceph-mon[76053]: pgmap v375: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 276 B/s rd, 0 op/s
Dec 01 10:03:39 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:03:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:40 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:03:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:40 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009ad0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:40 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:40.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:41 compute-2 ceph-mon[76053]: pgmap v376: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 276 B/s rd, 0 op/s
Dec 01 10:03:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:41.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:41 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:42 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:42 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009ad0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:42.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:03:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:43.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:03:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:43 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:43 compute-2 kernel: SELinux:  Converting 2774 SID table entries...
Dec 01 10:03:43 compute-2 kernel: SELinux:  policy capability network_peer_controls=1
Dec 01 10:03:43 compute-2 kernel: SELinux:  policy capability open_perms=1
Dec 01 10:03:43 compute-2 kernel: SELinux:  policy capability extended_socket_class=1
Dec 01 10:03:43 compute-2 kernel: SELinux:  policy capability always_check_network=0
Dec 01 10:03:43 compute-2 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 01 10:03:43 compute-2 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 01 10:03:43 compute-2 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 01 10:03:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:44 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:44 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8004530 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:44.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:45 compute-2 ceph-mon[76053]: pgmap v377: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 276 B/s rd, 0 op/s
Dec 01 10:03:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:03:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:45.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:03:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:45 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009ad0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:45 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:03:45 compute-2 ceph-mon[76053]: pgmap v378: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:03:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:46 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01c8001ee0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:46 compute-2 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Dec 01 10:03:46 compute-2 groupadd[167269]: group added to /etc/group: name=dnsmasq, GID=992
Dec 01 10:03:46 compute-2 groupadd[167269]: group added to /etc/gshadow: name=dnsmasq
Dec 01 10:03:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:46 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc0032f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:46 compute-2 groupadd[167269]: new group: name=dnsmasq, GID=992
Dec 01 10:03:46 compute-2 useradd[167286]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Dec 01 10:03:46 compute-2 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Dec 01 10:03:46 compute-2 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Dec 01 10:03:46 compute-2 podman[167268]: 2025-12-01 10:03:46.884081921 +0000 UTC m=+0.102402954 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 01 10:03:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:46.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:47.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:47 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8004530 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:47 compute-2 ceph-mon[76053]: pgmap v379: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 01 10:03:47 compute-2 groupadd[167316]: group added to /etc/group: name=clevis, GID=991
Dec 01 10:03:47 compute-2 groupadd[167316]: group added to /etc/gshadow: name=clevis
Dec 01 10:03:47 compute-2 groupadd[167316]: new group: name=clevis, GID=991
Dec 01 10:03:47 compute-2 useradd[167323]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Dec 01 10:03:47 compute-2 usermod[167333]: add 'clevis' to group 'tss'
Dec 01 10:03:47 compute-2 usermod[167333]: add 'clevis' to shadow group 'tss'
Dec 01 10:03:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:48 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009af0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:48 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01c8002a00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:48.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:49.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:49 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc0032f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:50 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8004530 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:50 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:03:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:50.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:03:51 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:03:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:51.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:51 compute-2 ceph-mon[76053]: pgmap v380: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:03:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:51 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01c8002a00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:52 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01c8002a00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:52 compute-2 ceph-mon[76053]: pgmap v381: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:03:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:52 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8004530 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:52.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:53 compute-2 polkitd[43617]: Reloading rules
Dec 01 10:03:53 compute-2 polkitd[43617]: Collecting garbage unconditionally...
Dec 01 10:03:53 compute-2 polkitd[43617]: Loading rules from directory /etc/polkit-1/rules.d
Dec 01 10:03:53 compute-2 polkitd[43617]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 01 10:03:53 compute-2 polkitd[43617]: Finished loading, compiling and executing 3 rules
Dec 01 10:03:53 compute-2 polkitd[43617]: Reloading rules
Dec 01 10:03:53 compute-2 polkitd[43617]: Collecting garbage unconditionally...
Dec 01 10:03:53 compute-2 polkitd[43617]: Loading rules from directory /etc/polkit-1/rules.d
Dec 01 10:03:53 compute-2 polkitd[43617]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 01 10:03:53 compute-2 polkitd[43617]: Finished loading, compiling and executing 3 rules
Dec 01 10:03:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:53.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:53 compute-2 ceph-mon[76053]: pgmap v382: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 10:03:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:53 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009b30 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:54 compute-2 groupadd[167526]: group added to /etc/group: name=ceph, GID=167
Dec 01 10:03:54 compute-2 groupadd[167526]: group added to /etc/gshadow: name=ceph
Dec 01 10:03:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:54 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01c8002a00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:54 compute-2 groupadd[167526]: new group: name=ceph, GID=167
Dec 01 10:03:54 compute-2 useradd[167532]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Dec 01 10:03:54 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:03:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:54 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc0032f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:54.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:55.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:55 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc0032f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:55 compute-2 ceph-mon[76053]: pgmap v383: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:03:56 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:03:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:56 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009b50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:56 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01c8002a00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:56.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:03:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:57.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:03:57 compute-2 systemd[1]: Stopping OpenSSH server daemon...
Dec 01 10:03:57 compute-2 sshd[1008]: Received signal 15; terminating.
Dec 01 10:03:57 compute-2 systemd[1]: sshd.service: Deactivated successfully.
Dec 01 10:03:57 compute-2 systemd[1]: Stopped OpenSSH server daemon.
Dec 01 10:03:57 compute-2 systemd[1]: sshd.service: Consumed 4.342s CPU time, read 32.0K from disk, written 112.0K to disk.
Dec 01 10:03:57 compute-2 systemd[1]: Stopped target sshd-keygen.target.
Dec 01 10:03:57 compute-2 systemd[1]: Stopping sshd-keygen.target...
Dec 01 10:03:57 compute-2 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 01 10:03:57 compute-2 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 01 10:03:57 compute-2 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 01 10:03:57 compute-2 systemd[1]: Reached target sshd-keygen.target.
Dec 01 10:03:57 compute-2 systemd[1]: Starting OpenSSH server daemon...
Dec 01 10:03:57 compute-2 sshd[168201]: Server listening on 0.0.0.0 port 22.
Dec 01 10:03:57 compute-2 sshd[168201]: Server listening on :: port 22.
Dec 01 10:03:57 compute-2 systemd[1]: Started OpenSSH server daemon.
Dec 01 10:03:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:57 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc0032f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:57 compute-2 ceph-mon[76053]: pgmap v384: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 01 10:03:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:58 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009b50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:03:58 compute-2 kernel: ganesha.nfsd[150221]: segfault at 50 ip 00007f02aa83632e sp 00007f02767fb210 error 4 in libntirpc.so.5.8[7f02aa81b000+2c000] likely on CPU 4 (core 0, socket 4)
Dec 01 10:03:58 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec 01 10:03:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:58 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009b50 fd 47 proxy ignored for local
Dec 01 10:03:58 compute-2 systemd[1]: Started Process Core Dump (PID 168358/UID 0).
Dec 01 10:03:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:58.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:03:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:03:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:03:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:59.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:03:59 compute-2 sudo[168400]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:03:59 compute-2 sudo[168400]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:03:59 compute-2 sudo[168400]: pam_unix(sudo:session): session closed for user root
Dec 01 10:03:59 compute-2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 01 10:03:59 compute-2 systemd[1]: Starting man-db-cache-update.service...
Dec 01 10:03:59 compute-2 systemd[1]: Reloading.
Dec 01 10:03:59 compute-2 systemd-sysv-generator[168488]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:03:59 compute-2 systemd-rc-local-generator[168484]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:04:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:00 compute-2 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 01 10:04:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:01.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:01.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:01 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:04:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:03.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:03 compute-2 systemd-coredump[168361]: Process 149577 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 41:
                                                    #0  0x00007f02aa83632e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Dec 01 10:04:03 compute-2 ceph-mon[76053]: pgmap v385: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:04:03 compute-2 systemd[1]: systemd-coredump@9-168358-0.service: Deactivated successfully.
Dec 01 10:04:03 compute-2 systemd[1]: systemd-coredump@9-168358-0.service: Consumed 1.523s CPU time.
Dec 01 10:04:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:03.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:03 compute-2 podman[168511]: 2025-12-01 10:04:03.332429783 +0000 UTC m=+0.026485153 container died 92cf32fe993836a877a53878e8b3e4e4f51b688e8f6e4d07c4beb99d3845abeb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 10:04:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:04 compute-2 systemd[1]: var-lib-containers-storage-overlay-ec1fe695f097301667b7a96d6f60fca9d09e7845a3a18294464492d85f3e3cad-merged.mount: Deactivated successfully.
Dec 01 10:04:04 compute-2 ceph-mon[76053]: pgmap v386: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:04:04 compute-2 podman[168511]: 2025-12-01 10:04:04.170721802 +0000 UTC m=+0.864777172 container remove 92cf32fe993836a877a53878e8b3e4e4f51b688e8f6e4d07c4beb99d3845abeb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 10:04:04 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec 01 10:04:04 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec 01 10:04:04 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.894s CPU time.
Dec 01 10:04:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:04:04.687 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:04:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:04:04.689 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:04:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:04:04.689 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:04:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:04:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:05.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:04:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:05 compute-2 ceph-mon[76053]: pgmap v387: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 10:04:05 compute-2 ceph-mon[76053]: pgmap v388: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:04:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:05.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:05 compute-2 sudo[148816]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:06 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:04:06 compute-2 sudo[171093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkxptuuiisbcwgusihbgunjpkuljwbmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583446.147755-970-75524096302446/AnsiballZ_systemd.py'
Dec 01 10:04:06 compute-2 sudo[171093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:06 compute-2 podman[171008]: 2025-12-01 10:04:06.760508378 +0000 UTC m=+0.060749247 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 01 10:04:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:07.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:07 compute-2 python3.9[171121]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 01 10:04:07 compute-2 systemd[1]: Reloading.
Dec 01 10:04:07 compute-2 systemd-sysv-generator[171665]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:04:07 compute-2 systemd-rc-local-generator[171659]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:04:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:07.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:07 compute-2 sudo[171093]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:07 compute-2 ceph-mon[76053]: pgmap v389: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 10:04:07 compute-2 sudo[172542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yveimmnwgvlukbriinuscgpmspsfgqwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583447.6517022-970-152070838795260/AnsiballZ_systemd.py'
Dec 01 10:04:07 compute-2 sudo[172542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:08 compute-2 python3.9[172569]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 01 10:04:08 compute-2 systemd[1]: Reloading.
Dec 01 10:04:08 compute-2 systemd-rc-local-generator[173051]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:04:08 compute-2 systemd-sysv-generator[173056]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:04:08 compute-2 sudo[172542]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100408 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:04:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:04:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:09.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:04:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:09 compute-2 sudo[173944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsojmrrepgfykbwfohzolwmzhgmfooro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583448.8271508-970-214072128117312/AnsiballZ_systemd.py'
Dec 01 10:04:09 compute-2 sudo[173944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:09.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:09 compute-2 python3.9[173974]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 01 10:04:09 compute-2 systemd[1]: Reloading.
Dec 01 10:04:09 compute-2 systemd-rc-local-generator[174474]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:04:09 compute-2 systemd-sysv-generator[174481]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:04:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:10 compute-2 ceph-mon[76053]: pgmap v390: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:04:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:04:10 compute-2 sudo[173944]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:11.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:11 compute-2 sudo[175895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scsyaoroppkzhhkpmtoivuxfwylqgvkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583450.7096055-970-5353252598380/AnsiballZ_systemd.py'
Dec 01 10:04:11 compute-2 sudo[175895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:11 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:04:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:04:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:11.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:04:11 compute-2 python3.9[175915]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 01 10:04:11 compute-2 systemd[1]: Reloading.
Dec 01 10:04:11 compute-2 systemd-rc-local-generator[176387]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:04:11 compute-2 systemd-sysv-generator[176394]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:04:11 compute-2 ceph-mon[76053]: pgmap v391: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:04:11 compute-2 sudo[175895]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:12 compute-2 sudo[177345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vakapdhygzfxurswarwjbpnnxbtfylqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583452.067816-1058-262221515697826/AnsiballZ_systemd.py'
Dec 01 10:04:12 compute-2 sudo[177345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:12 compute-2 python3.9[177367]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 10:04:12 compute-2 systemd[1]: Reloading.
Dec 01 10:04:12 compute-2 systemd-sysv-generator[177800]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:04:12 compute-2 systemd-rc-local-generator[177794]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:04:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:04:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:13.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:04:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:13 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 01 10:04:13 compute-2 systemd[1]: Finished man-db-cache-update.service.
Dec 01 10:04:13 compute-2 systemd[1]: man-db-cache-update.service: Consumed 11.449s CPU time.
Dec 01 10:04:13 compute-2 systemd[1]: run-r7aaf59b4f8814f2a97f380156aee268e.service: Deactivated successfully.
Dec 01 10:04:13 compute-2 sudo[177345]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:13.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:13 compute-2 sudo[178044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwrqftzknmweraytugetcfhrzqqrdwql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583453.2548-1058-79316623653695/AnsiballZ_systemd.py'
Dec 01 10:04:13 compute-2 sudo[178044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:13 compute-2 ceph-mon[76053]: pgmap v392: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:04:13 compute-2 python3.9[178046]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 10:04:13 compute-2 systemd[1]: Reloading.
Dec 01 10:04:14 compute-2 systemd-sysv-generator[178078]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:04:14 compute-2 systemd-rc-local-generator[178075]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:04:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:14 compute-2 sudo[178044]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:14 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 10.
Dec 01 10:04:14 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 10:04:14 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.894s CPU time.
Dec 01 10:04:14 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec 01 10:04:14 compute-2 sudo[178268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcbleussezraxkrpwpkyyeavtfmaslzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583454.4571517-1058-19264538366508/AnsiballZ_systemd.py'
Dec 01 10:04:14 compute-2 sudo[178268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:14 compute-2 podman[178281]: 2025-12-01 10:04:14.824468006 +0000 UTC m=+0.050514956 container create 7b646f0244cbf6b0aedcb23b49430e30df9543134e376d7c337cac9617fcecfc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 10:04:14 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/092d456032e1daa697416cb36bacc889c224a44c9a018fcf1cb6d116fdc54261/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 01 10:04:14 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/092d456032e1daa697416cb36bacc889c224a44c9a018fcf1cb6d116fdc54261/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 10:04:14 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/092d456032e1daa697416cb36bacc889c224a44c9a018fcf1cb6d116fdc54261/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 10:04:14 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/092d456032e1daa697416cb36bacc889c224a44c9a018fcf1cb6d116fdc54261/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 10:04:14 compute-2 podman[178281]: 2025-12-01 10:04:14.886498594 +0000 UTC m=+0.112545564 container init 7b646f0244cbf6b0aedcb23b49430e30df9543134e376d7c337cac9617fcecfc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 10:04:14 compute-2 podman[178281]: 2025-12-01 10:04:14.892137973 +0000 UTC m=+0.118184923 container start 7b646f0244cbf6b0aedcb23b49430e30df9543134e376d7c337cac9617fcecfc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 10:04:14 compute-2 podman[178281]: 2025-12-01 10:04:14.79868865 +0000 UTC m=+0.024735630 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 10:04:14 compute-2 bash[178281]: 7b646f0244cbf6b0aedcb23b49430e30df9543134e376d7c337cac9617fcecfc
Dec 01 10:04:14 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 10:04:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:14 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 01 10:04:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:14 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 01 10:04:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:14 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 01 10:04:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:14 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 01 10:04:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:14 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 01 10:04:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:14 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 01 10:04:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:14 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 01 10:04:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:14 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:04:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:04:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:15.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:04:15 compute-2 python3.9[178280]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 10:04:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:15 compute-2 systemd[1]: Reloading.
Dec 01 10:04:15 compute-2 systemd-rc-local-generator[178369]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:04:15 compute-2 systemd-sysv-generator[178372]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:04:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:15.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:15 compute-2 sudo[178268]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:15 compute-2 ceph-mon[76053]: pgmap v393: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:04:15 compute-2 sudo[178527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyfwxezeyredqeohlsviyhkmrqriljbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583455.6000896-1058-148962463712922/AnsiballZ_systemd.py'
Dec 01 10:04:15 compute-2 sudo[178527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:16 compute-2 python3.9[178529]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 10:04:16 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:04:16 compute-2 sudo[178527]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:16 compute-2 sudo[178682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alxoobeoetqmfeucfvpyfrmgxnomzpwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583456.5255232-1058-264432541212628/AnsiballZ_systemd.py'
Dec 01 10:04:16 compute-2 sudo[178682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:04:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:17.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:04:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:17 compute-2 python3.9[178684]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 10:04:17 compute-2 systemd[1]: Reloading.
Dec 01 10:04:17 compute-2 podman[178687]: 2025-12-01 10:04:17.288496546 +0000 UTC m=+0.094487160 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=ovn_controller)
Dec 01 10:04:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:17.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:17 compute-2 systemd-rc-local-generator[178743]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:04:17 compute-2 systemd-sysv-generator[178746]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:04:17 compute-2 sudo[178682]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:17 compute-2 ceph-mon[76053]: pgmap v394: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Dec 01 10:04:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:19.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:19.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:19 compute-2 sudo[178777]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:04:19 compute-2 sudo[178777]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:04:19 compute-2 sudo[178777]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:19 compute-2 ceph-mon[76053]: pgmap v395: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Dec 01 10:04:19 compute-2 sudo[178927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnjowjhvuqktabibdwplowapkjueueci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583459.4965422-1166-237433889874113/AnsiballZ_systemd.py'
Dec 01 10:04:19 compute-2 sudo[178927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:20 compute-2 python3.9[178929]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 01 10:04:20 compute-2 systemd[1]: Reloading.
Dec 01 10:04:20 compute-2 systemd-sysv-generator[178962]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:04:20 compute-2 systemd-rc-local-generator[178957]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:04:20 compute-2 systemd[1]: Listening on libvirt proxy daemon socket.
Dec 01 10:04:20 compute-2 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Dec 01 10:04:20 compute-2 sudo[178927]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:21 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:04:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:21 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:04:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:21.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:21 compute-2 sudo[179121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwfncjnshfbayjomndjxylyrxzjvrzbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583460.9344356-1190-130943048524287/AnsiballZ_systemd.py'
Dec 01 10:04:21 compute-2 sudo[179121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:21 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:04:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:04:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:21.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:04:21 compute-2 python3.9[179123]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 10:04:21 compute-2 sudo[179121]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:21 compute-2 ceph-mon[76053]: pgmap v396: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Dec 01 10:04:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:22 compute-2 sudo[179277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpskodmywebushidfgjvchqfpqsjmtot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583461.7747521-1190-257248326237275/AnsiballZ_systemd.py'
Dec 01 10:04:22 compute-2 sudo[179277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:22 compute-2 python3.9[179279]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 10:04:22 compute-2 sudo[179277]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:22 compute-2 sudo[179432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmybysumpaeiphmaahngavpgunbmhjic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583462.6081588-1190-45929000200015/AnsiballZ_systemd.py'
Dec 01 10:04:22 compute-2 sudo[179432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:23.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:23 compute-2 python3.9[179434]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 10:04:23 compute-2 sudo[179432]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:23.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:23 compute-2 sudo[179589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aocedmarwytkhaohmhpdzxyvmhyfimhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583463.4369051-1190-235346128457061/AnsiballZ_systemd.py'
Dec 01 10:04:23 compute-2 sudo[179589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:23 compute-2 ceph-mon[76053]: pgmap v397: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Dec 01 10:04:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:24 compute-2 python3.9[179591]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 10:04:24 compute-2 sudo[179589]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:24 compute-2 sudo[179744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blwyxwxxpbhwcxsnwkjqsdjqielwumgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583464.2917218-1190-191135312165958/AnsiballZ_systemd.py'
Dec 01 10:04:24 compute-2 sudo[179744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:24 compute-2 python3.9[179746]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 10:04:24 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:04:24 compute-2 sudo[179744]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:04:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:25.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:04:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:25.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:25 compute-2 sudo[179901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bitvpfwdbzddcnsinsrhjrdpcejqbsce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583465.0873065-1190-157399489073895/AnsiballZ_systemd.py'
Dec 01 10:04:25 compute-2 sudo[179901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:25 compute-2 python3.9[179903]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 10:04:25 compute-2 sudo[179901]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:25 compute-2 ceph-mon[76053]: pgmap v398: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 597 B/s wr, 1 op/s
Dec 01 10:04:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:26 compute-2 sudo[180056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxgfgarzasvdbzbfcxuuraovucinnncm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583465.9459498-1190-58964600869824/AnsiballZ_systemd.py'
Dec 01 10:04:26 compute-2 sudo[180056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:26 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:04:26 compute-2 python3.9[180058]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 10:04:26 compute-2 sudo[180056]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 10:04:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec 01 10:04:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec 01 10:04:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec 01 10:04:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec 01 10:04:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec 01 10:04:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec 01 10:04:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:04:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:04:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:04:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec 01 10:04:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:04:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec 01 10:04:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec 01 10:04:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec 01 10:04:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec 01 10:04:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec 01 10:04:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec 01 10:04:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec 01 10:04:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec 01 10:04:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec 01 10:04:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec 01 10:04:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec 01 10:04:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec 01 10:04:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 10:04:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec 01 10:04:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 10:04:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:04:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:27.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:04:27 compute-2 sudo[180224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltprqjpxruuevxhmbgpvsfxhridkbbes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583466.8109903-1190-272857686219319/AnsiballZ_systemd.py'
Dec 01 10:04:27 compute-2 sudo[180224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:27.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:27 compute-2 python3.9[180226]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 10:04:27 compute-2 sudo[180224]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc068000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:27 compute-2 sudo[180384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmwdoxxbwloidnjxylrngnrlbelsssfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583467.6234436-1190-118905389668115/AnsiballZ_systemd.py'
Dec 01 10:04:27 compute-2 sudo[180384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:27 compute-2 ceph-mon[76053]: pgmap v399: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 10:04:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:28 compute-2 python3.9[180386]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 10:04:28 compute-2 sudo[180384]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:28 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc064001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:28 compute-2 sudo[180539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttsyuwnjlpbhgjzyhfnbqobbioqleckh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583468.443584-1190-88969968895860/AnsiballZ_systemd.py'
Dec 01 10:04:28 compute-2 sudo[180539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:28 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc044000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:29 compute-2 python3.9[180541]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 10:04:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:04:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:29.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:04:29 compute-2 sudo[180539]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:29.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:29 compute-2 sudo[180696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvleuewxoakopwisesdwrhczlybscmya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583469.2261431-1190-98290679578744/AnsiballZ_systemd.py'
Dec 01 10:04:29 compute-2 sudo[180696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:29 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc068000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:29 compute-2 python3.9[180698]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 10:04:29 compute-2 sudo[180696]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:30 compute-2 ceph-mon[76053]: pgmap v400: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Dec 01 10:04:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:30 compute-2 sudo[180851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcxnmcwabiwnkuhzafrzbpbbaahrxbyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583470.0101655-1190-69973456835396/AnsiballZ_systemd.py'
Dec 01 10:04:30 compute-2 sudo[180851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:30 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc05c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:30 compute-2 python3.9[180853]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 10:04:30 compute-2 sudo[180851]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100430 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:04:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:30 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0640025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:31.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:31 compute-2 sudo[181006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xamfitbctisdrpvdifyvnzubyabwfrce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583470.7946281-1190-189416109341971/AnsiballZ_systemd.py'
Dec 01 10:04:31 compute-2 sudo[181006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:31 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:04:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:04:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:31.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:04:31 compute-2 python3.9[181008]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 10:04:31 compute-2 sudo[181006]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:31 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0440016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:31 compute-2 sudo[181163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikxzobqatstustyyyzkaanduuttnixsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583471.5646331-1190-24595047842638/AnsiballZ_systemd.py'
Dec 01 10:04:31 compute-2 sudo[181163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:32 compute-2 ceph-mon[76053]: pgmap v401: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Dec 01 10:04:32 compute-2 python3.9[181165]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 01 10:04:32 compute-2 sudo[181163]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:32 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0680021f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:32 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc05c002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:33.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:33 compute-2 ceph-mon[76053]: pgmap v402: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Dec 01 10:04:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:33.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:33 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0640025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:34 compute-2 sudo[181320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swftmgjadlskgdxozpcioiwunahknavv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583474.0766342-1496-189262459781145/AnsiballZ_file.py'
Dec 01 10:04:34 compute-2 sudo[181320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:34 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0440016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:34 compute-2 python3.9[181322]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:04:34 compute-2 sudo[181320]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:34 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0680021f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:35 compute-2 sudo[181472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uelevcmnbnpshtajhlyncepargqhqfaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583474.7668092-1496-225158721397033/AnsiballZ_file.py'
Dec 01 10:04:35 compute-2 sudo[181472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:04:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:35.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:04:35 compute-2 python3.9[181474]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:04:35 compute-2 sudo[181472]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:35.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:35 compute-2 ceph-mon[76053]: pgmap v403: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Dec 01 10:04:35 compute-2 sudo[181626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivcmqhyheclghujqdszatnoynczwuzxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583475.4023561-1496-130460913147657/AnsiballZ_file.py'
Dec 01 10:04:35 compute-2 sudo[181626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:35 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0440016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:35 compute-2 python3.9[181628]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:04:35 compute-2 sudo[181626]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:36 compute-2 sudo[181778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvcqgxafyrhumeyfypdzgzkfkvptmmqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583476.0109274-1496-87349012105238/AnsiballZ_file.py'
Dec 01 10:04:36 compute-2 sudo[181778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:36 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:04:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:36 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc05c002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:36 compute-2 python3.9[181780]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:04:36 compute-2 sudo[181778]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:36 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0640025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:36 compute-2 sudo[181943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhjhsmllbivsgpognqpkfbfbxpgdelhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583476.661134-1496-80381463979959/AnsiballZ_file.py'
Dec 01 10:04:36 compute-2 sudo[181943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:36 compute-2 podman[181904]: 2025-12-01 10:04:36.951074874 +0000 UTC m=+0.055990461 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 01 10:04:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:37.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:37 compute-2 python3.9[181951]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:04:37 compute-2 sudo[181943]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:37.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:37 compute-2 sudo[182103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgfrxankjjgkeqpwhtnxrfnshcqcowgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583477.279749-1496-228059726466585/AnsiballZ_file.py'
Dec 01 10:04:37 compute-2 sudo[182103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:37 compute-2 ceph-mon[76053]: pgmap v404: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Dec 01 10:04:37 compute-2 sudo[182106]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:04:37 compute-2 sudo[182106]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:04:37 compute-2 sudo[182106]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:37 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0680021f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:37 compute-2 python3.9[182105]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:04:37 compute-2 sudo[182131]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:04:37 compute-2 sudo[182131]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:04:37 compute-2 sudo[182103]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:38 compute-2 sudo[182131]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:38 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0440016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:38 compute-2 sudo[182337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqdrwuroinxolydxmlpktxxzedxwylti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583478.2939494-1624-209233100496081/AnsiballZ_stat.py'
Dec 01 10:04:38 compute-2 sudo[182337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:38 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:04:38 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:04:38 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:04:38 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:04:38 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:04:38 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:04:38 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:04:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:38 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc05c002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:38 compute-2 python3.9[182339]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:04:38 compute-2 sudo[182337]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:39.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:04:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:39.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:04:39 compute-2 sudo[182485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwmsbocmkrhatnihbvdcuvmeklkiikbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583478.2939494-1624-209233100496081/AnsiballZ_copy.py'
Dec 01 10:04:39 compute-2 sudo[182485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:39 compute-2 sudo[182447]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:04:39 compute-2 sudo[182447]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:04:39 compute-2 sudo[182447]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:39 compute-2 ceph-mon[76053]: pgmap v405: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Dec 01 10:04:39 compute-2 ceph-mon[76053]: pgmap v406: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 304 B/s rd, 0 B/s wr, 0 op/s
Dec 01 10:04:39 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:04:39 compute-2 python3.9[182490]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764583478.2939494-1624-209233100496081/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:04:39 compute-2 sudo[182485]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:39 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0640025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:40 compute-2 sudo[182641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckorzovgtwsecrskruquctokwmjqhjmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583479.855772-1624-22080795560201/AnsiballZ_stat.py'
Dec 01 10:04:40 compute-2 sudo[182641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:40 compute-2 python3.9[182643]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:04:40 compute-2 sudo[182641]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:40 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0680095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:40 compute-2 sudo[182766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnxhjzetputnzkctefhyisqlyrwcqnux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583479.855772-1624-22080795560201/AnsiballZ_copy.py'
Dec 01 10:04:40 compute-2 sudo[182766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:40 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc044002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:40 compute-2 python3.9[182768]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764583479.855772-1624-22080795560201/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:04:40 compute-2 sudo[182766]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:41.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:41 compute-2 sudo[182919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwwqbolbzncpnrmbdatqsomcutjljypo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583480.9894907-1624-24883430627237/AnsiballZ_stat.py'
Dec 01 10:04:41 compute-2 sudo[182919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:41 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:04:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:41.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:41 compute-2 python3.9[182921]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:04:41 compute-2 sudo[182919]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:41 compute-2 ceph-mon[76053]: pgmap v407: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 304 B/s rd, 0 B/s wr, 0 op/s
Dec 01 10:04:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:41 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc05c0039b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:41 compute-2 sudo[183045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrdtgvfqxtljirxeckfsqpefnelfgnqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583480.9894907-1624-24883430627237/AnsiballZ_copy.py'
Dec 01 10:04:41 compute-2 sudo[183045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:42 compute-2 python3.9[183047]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764583480.9894907-1624-24883430627237/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:04:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:42 compute-2 sudo[183045]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:42 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0640025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:42 compute-2 sudo[183197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inhrqleaiexygvhowfywsaqegecosxno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583482.215384-1624-12534480883211/AnsiballZ_stat.py'
Dec 01 10:04:42 compute-2 sudo[183197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:42 compute-2 python3.9[183199]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:04:42 compute-2 sudo[183197]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:42 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0680095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:43 compute-2 sudo[183322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iifbjradexkkugfxbfdqvphbaablkzgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583482.215384-1624-12534480883211/AnsiballZ_copy.py'
Dec 01 10:04:43 compute-2 sudo[183322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:04:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:43.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:04:43 compute-2 python3.9[183324]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764583482.215384-1624-12534480883211/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:04:43 compute-2 sudo[183322]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:43.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:43 compute-2 sudo[183476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnalkrxzwqcyrimcvqqdnjjoqnmwksnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583483.4228725-1624-72315256831391/AnsiballZ_stat.py'
Dec 01 10:04:43 compute-2 sudo[183476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:43 compute-2 ceph-mon[76053]: pgmap v408: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 304 B/s rd, 0 op/s
Dec 01 10:04:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:43 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc044002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:43 compute-2 python3.9[183478]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:04:43 compute-2 sudo[183479]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:04:43 compute-2 sudo[183479]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:04:43 compute-2 sudo[183476]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:43 compute-2 sudo[183479]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:44 compute-2 sudo[183626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goffgwvzwsbwtmlqamkivuhcmxoigpyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583483.4228725-1624-72315256831391/AnsiballZ_copy.py'
Dec 01 10:04:44 compute-2 sudo[183626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:44 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc05c0039b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:44 compute-2 python3.9[183628]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764583483.4228725-1624-72315256831391/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:04:44 compute-2 sudo[183626]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:44 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:04:44 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:04:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:44 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0640025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:45 compute-2 sudo[183778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tugdpatmqplntognmgufxvrtuffpsphg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583484.7309415-1624-171707453779614/AnsiballZ_stat.py'
Dec 01 10:04:45 compute-2 sudo[183778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:04:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:45.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:04:45 compute-2 python3.9[183780]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:04:45 compute-2 sudo[183778]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:45.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:45 compute-2 sudo[183905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nctpaceqewnsdvjiqnuimxnhsosmkwze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583484.7309415-1624-171707453779614/AnsiballZ_copy.py'
Dec 01 10:04:45 compute-2 sudo[183905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:45 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc06800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:45 compute-2 ceph-mon[76053]: pgmap v409: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 304 B/s rd, 0 op/s
Dec 01 10:04:45 compute-2 python3.9[183907]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764583484.7309415-1624-171707453779614/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:04:45 compute-2 sudo[183905]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:46 compute-2 sudo[184057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auwsleyveynnokanpyoitmfjrwxzkfhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583485.9746032-1624-143299235535133/AnsiballZ_stat.py'
Dec 01 10:04:46 compute-2 sudo[184057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:46 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:04:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:46 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc044003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:46 compute-2 python3.9[184059]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:04:46 compute-2 sudo[184057]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:46 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc05c0039b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:46 compute-2 sudo[184180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytxokkylzmsippyjbjdrveoktbbzdwpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583485.9746032-1624-143299235535133/AnsiballZ_copy.py'
Dec 01 10:04:46 compute-2 sudo[184180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:47 compute-2 python3.9[184182]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764583485.9746032-1624-143299235535133/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:04:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:47 compute-2 sudo[184180]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:47.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:47.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:47 compute-2 sudo[184334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htvlmszyqmwwtxkolfbplvbvsgpocjwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583487.2163193-1624-225092028279596/AnsiballZ_stat.py'
Dec 01 10:04:47 compute-2 sudo[184334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:47 compute-2 python3.9[184336]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:04:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:47 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc05c0039b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:47 compute-2 sudo[184334]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:47 compute-2 ceph-mon[76053]: pgmap v410: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 304 B/s rd, 0 op/s
Dec 01 10:04:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:48 compute-2 sudo[184473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luugwsxppzkmdldqselhvcuodpekjxtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583487.2163193-1624-225092028279596/AnsiballZ_copy.py'
Dec 01 10:04:48 compute-2 sudo[184473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:48 compute-2 podman[184433]: 2025-12-01 10:04:48.169369299 +0000 UTC m=+0.099864831 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 01 10:04:48 compute-2 python3.9[184479]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764583487.2163193-1624-225092028279596/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:04:48 compute-2 sudo[184473]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:48 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc06800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:48 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc040000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:49.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:49.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:49 compute-2 sudo[184636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgkbhqtbgrwtampyclsltnfckxymlmsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583489.2258394-1964-44439711109997/AnsiballZ_command.py'
Dec 01 10:04:49 compute-2 sudo[184636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:49 compute-2 python3.9[184638]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Dec 01 10:04:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:49 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc044003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:49 compute-2 ceph-mon[76053]: pgmap v411: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 304 B/s rd, 0 op/s
Dec 01 10:04:49 compute-2 sudo[184636]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:50 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0640025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:50 compute-2 sudo[184789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssikolgmqtdhkarwjfbwbrbhcchavbou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583490.1098924-1991-494586254739/AnsiballZ_file.py'
Dec 01 10:04:50 compute-2 sudo[184789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:50 compute-2 python3.9[184791]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:04:50 compute-2 sudo[184789]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:50 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc06800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:51 compute-2 sudo[184941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgcfcantuaveicmqjxamijficdvcspxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583490.8200808-1991-48677244421002/AnsiballZ_file.py'
Dec 01 10:04:51 compute-2 sudo[184941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:04:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:51.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:04:51 compute-2 python3.9[184943]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:04:51 compute-2 sudo[184941]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:51 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:04:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:51.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:51 compute-2 sudo[185095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyvirajohrdhvcquercxgdlvnjlgqncj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583491.4439642-1991-144657149293551/AnsiballZ_file.py'
Dec 01 10:04:51 compute-2 sudo[185095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:51 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0400016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:51 compute-2 ceph-mon[76053]: pgmap v412: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:04:51 compute-2 python3.9[185097]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:04:51 compute-2 sudo[185095]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:52 compute-2 sudo[185247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcztlcaddnltzmlthetpzwvcdnwtbckm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583492.0713615-1991-131953135462193/AnsiballZ_file.py'
Dec 01 10:04:52 compute-2 sudo[185247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:52 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc044003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:52 compute-2 python3.9[185249]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:04:52 compute-2 sudo[185247]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:52 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0640025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:53 compute-2 sudo[185399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-macpxkibdriohgjmbelqooccptwfpoyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583492.7310874-1991-222833411901755/AnsiballZ_file.py'
Dec 01 10:04:53 compute-2 sudo[185399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.002000048s ======
Dec 01 10:04:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:53.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000048s
Dec 01 10:04:53 compute-2 python3.9[185401]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:04:53 compute-2 sudo[185399]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:04:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:53.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:04:53 compute-2 sudo[185553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtbuqdvhpnenskcqypquuzzvqkzcnmku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583493.3835864-1991-279515201581456/AnsiballZ_file.py'
Dec 01 10:04:53 compute-2 sudo[185553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:53 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc06800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:53 compute-2 python3.9[185555]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:04:53 compute-2 sudo[185553]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:54 compute-2 ceph-mon[76053]: pgmap v413: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 10:04:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:54 compute-2 sudo[185705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztxypgbbqhieejphwkwijoqdhqruvjno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583494.0207622-1991-30346655874205/AnsiballZ_file.py'
Dec 01 10:04:54 compute-2 sudo[185705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:54 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0400016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:54 compute-2 python3.9[185707]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:04:54 compute-2 sudo[185705]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:54 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc044003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:54 compute-2 sudo[185858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpiarwpnkaltpibardramwouqcfhekdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583494.6708803-1991-276548510342660/AnsiballZ_file.py'
Dec 01 10:04:54 compute-2 sudo[185858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:04:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:55.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:04:55 compute-2 python3.9[185860]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:04:55 compute-2 sudo[185858]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:04:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:04:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:55.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:04:55 compute-2 sudo[186012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygfrcjpspkvwrgonrmbgscbprncuaxka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583495.3161926-1991-149537490602345/AnsiballZ_file.py'
Dec 01 10:04:55 compute-2 sudo[186012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:55 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0640025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:55 compute-2 python3.9[186014]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:04:55 compute-2 sudo[186012]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:56 compute-2 sudo[186164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwmlmdyuqpcgmsjojaunkfdzhywijecf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583495.956108-1991-250459443298980/AnsiballZ_file.py'
Dec 01 10:04:56 compute-2 sudo[186164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:56 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:04:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:56 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc06800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:56 compute-2 python3.9[186166]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:04:56 compute-2 sudo[186164]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:56 compute-2 ceph-mon[76053]: pgmap v414: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:04:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:56 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0400016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:56 compute-2 sudo[186317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xoslttzdwmychxdcmyymbdetbgrhnipt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583496.5926173-1991-71989828144035/AnsiballZ_file.py'
Dec 01 10:04:56 compute-2 sudo[186317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:57 compute-2 python3.9[186319]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:04:57 compute-2 sudo[186317]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:57.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:57.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:57 compute-2 sudo[186471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtjfqvmvrhxihgervarydpqmcmlcnevf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583497.2313676-1991-164297890243067/AnsiballZ_file.py'
Dec 01 10:04:57 compute-2 sudo[186471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:57 compute-2 ceph-mon[76053]: pgmap v415: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 01 10:04:57 compute-2 python3.9[186473]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:04:57 compute-2 sudo[186471]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:57 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc044003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:58 compute-2 sudo[186623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxaqpryadclyxqtpzqhhrdfoslgnlnvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583497.8841004-1991-33677336912751/AnsiballZ_file.py'
Dec 01 10:04:58 compute-2 sudo[186623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:58 compute-2 python3.9[186625]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:04:58 compute-2 sudo[186623]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:58 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0640025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:58 compute-2 sudo[186775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqbbydjppapgpdlotxbhxvgsrpdwrbig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583498.5225291-1991-111939766973530/AnsiballZ_file.py'
Dec 01 10:04:58 compute-2 sudo[186775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:04:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:58 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc06800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:59 compute-2 python3.9[186777]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:04:59 compute-2 sudo[186775]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:04:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:59.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:59 compute-2 ceph-mon[76053]: pgmap v416: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:04:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:04:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:04:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:59.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:04:59 compute-2 sudo[186805]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:04:59 compute-2 sudo[186805]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:04:59 compute-2 sudo[186805]: pam_unix(sudo:session): session closed for user root
Dec 01 10:04:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:59 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc040002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.785581) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583499785935, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 4295, "num_deletes": 502, "total_data_size": 11684104, "memory_usage": 11864280, "flush_reason": "Manual Compaction"}
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583499834749, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 4370129, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13300, "largest_seqno": 17590, "table_properties": {"data_size": 4358823, "index_size": 6328, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3909, "raw_key_size": 30747, "raw_average_key_size": 19, "raw_value_size": 4331874, "raw_average_value_size": 2805, "num_data_blocks": 275, "num_entries": 1544, "num_filter_entries": 1544, "num_deletions": 502, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764583099, "oldest_key_time": 1764583099, "file_creation_time": 1764583499, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 49206 microseconds, and 12676 cpu microseconds.
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.834835) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 4370129 bytes OK
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.834861) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.836806) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.836821) EVENT_LOG_v1 {"time_micros": 1764583499836817, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.836841) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 11665189, prev total WAL file size 11665189, number of live WAL files 2.
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.839951) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323530' seq:72057594037927935, type:22 .. '6D67727374617400353032' seq:0, type:0; will stop at (end)
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(4267KB)], [27(13MB)]
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583499840095, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 18164036, "oldest_snapshot_seqno": -1}
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 5007 keys, 13509112 bytes, temperature: kUnknown
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583499937516, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 13509112, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13474130, "index_size": 21368, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12549, "raw_key_size": 125413, "raw_average_key_size": 25, "raw_value_size": 13381695, "raw_average_value_size": 2672, "num_data_blocks": 892, "num_entries": 5007, "num_filter_entries": 5007, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764583499, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.937879) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 13509112 bytes
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.939410) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 186.2 rd, 138.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.2, 13.2 +0.0 blob) out(12.9 +0.0 blob), read-write-amplify(7.2) write-amplify(3.1) OK, records in: 5836, records dropped: 829 output_compression: NoCompression
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.939451) EVENT_LOG_v1 {"time_micros": 1764583499939427, "job": 14, "event": "compaction_finished", "compaction_time_micros": 97572, "compaction_time_cpu_micros": 33290, "output_level": 6, "num_output_files": 1, "total_output_size": 13509112, "num_input_records": 5836, "num_output_records": 5007, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583499940603, "job": 14, "event": "table_file_deletion", "file_number": 29}
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583499943347, "job": 14, "event": "table_file_deletion", "file_number": 27}
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.839647) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.943425) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.943430) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.943431) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.943433) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:04:59 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.943434) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:05:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:00 compute-2 sudo[186955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltjtzpgwoyxfaixmjgupvfzcnabyxglb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583500.0319586-2287-116197897857100/AnsiballZ_stat.py'
Dec 01 10:05:00 compute-2 sudo[186955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:05:00 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc038000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:05:00 compute-2 python3.9[186957]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:05:00 compute-2 sudo[186955]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:05:00 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc044003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:05:00 compute-2 sudo[187078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acffwqaoirkqljxvvscfkbesokafpdxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583500.0319586-2287-116197897857100/AnsiballZ_copy.py'
Dec 01 10:05:00 compute-2 sudo[187078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:01 compute-2 python3.9[187080]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583500.0319586-2287-116197897857100/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:01.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:01 compute-2 sudo[187078]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:01 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:05:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:01.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:01 compute-2 sudo[187232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzdewwwiabvjpqhqkyntevfrkuoauxnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583501.2807972-2287-132909672446147/AnsiballZ_stat.py'
Dec 01 10:05:01 compute-2 sudo[187232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:01 compute-2 sshd-session[185831]: Invalid user ir from 45.78.219.119 port 45228
Dec 01 10:05:01 compute-2 python3.9[187234]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:05:01 compute-2 sudo[187232]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:05:01 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc06800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:05:01 compute-2 ceph-mon[76053]: pgmap v417: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:05:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:02 compute-2 sudo[187355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btgkhcxjmtiowbfmkvifovqdtvmaglln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583501.2807972-2287-132909672446147/AnsiballZ_copy.py'
Dec 01 10:05:02 compute-2 sudo[187355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:02 compute-2 python3.9[187357]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583501.2807972-2287-132909672446147/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:02 compute-2 sudo[187355]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:05:02 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc040002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:05:02 compute-2 sudo[187507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxwbmnjgvlenzvyoludcbfegbyywtpos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583502.4093604-2287-201524485847991/AnsiballZ_stat.py'
Dec 01 10:05:02 compute-2 sudo[187507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:05:02 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0380016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:05:02 compute-2 python3.9[187509]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:05:02 compute-2 sudo[187507]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:03.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:03 compute-2 sudo[187631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snchiibtsuleatwrnlpkdmsgfwneyphz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583502.4093604-2287-201524485847991/AnsiballZ_copy.py'
Dec 01 10:05:03 compute-2 sudo[187631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:03.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:03 compute-2 python3.9[187633]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583502.4093604-2287-201524485847991/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:03 compute-2 sshd-session[185831]: Received disconnect from 45.78.219.119 port 45228:11: Bye Bye [preauth]
Dec 01 10:05:03 compute-2 sshd-session[185831]: Disconnected from invalid user ir 45.78.219.119 port 45228 [preauth]
Dec 01 10:05:03 compute-2 sudo[187631]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:05:03 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc044003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:05:03 compute-2 sudo[187784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yutfcycypgaxzqclnlbfnngcsygbovjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583503.5824363-2287-126810174105926/AnsiballZ_stat.py'
Dec 01 10:05:03 compute-2 sudo[187784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:04 compute-2 python3.9[187786]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:05:04 compute-2 sudo[187784]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:04 compute-2 ceph-mon[76053]: pgmap v418: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 10:05:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:05:04 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc06800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:05:04 compute-2 sudo[187907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxgyfbghccofqnaqxjsyajibpiystfbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583503.5824363-2287-126810174105926/AnsiballZ_copy.py'
Dec 01 10:05:04 compute-2 sudo[187907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:04 compute-2 python3.9[187909]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583503.5824363-2287-126810174105926/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:04 compute-2 sudo[187907]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:05:04.688 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:05:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:05:04.690 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:05:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:05:04.690 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:05:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:05:04 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc044003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:05:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:05 compute-2 sudo[188059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hawmovnuywfkjnwqyxejhcwfjirmliqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583504.812595-2287-55766279443782/AnsiballZ_stat.py'
Dec 01 10:05:05 compute-2 sudo[188059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:05:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:05.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:05:05 compute-2 python3.9[188061]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:05:05 compute-2 sudo[188059]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:05.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:05 compute-2 sudo[188184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wabokvfzhfzbrubyhbgifxomwzrxxzfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583504.812595-2287-55766279443782/AnsiballZ_copy.py'
Dec 01 10:05:05 compute-2 sudo[188184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:05 compute-2 ceph-mon[76053]: pgmap v419: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:05:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:05:05 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc040002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:05:05 compute-2 python3.9[188186]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583504.812595-2287-55766279443782/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:05 compute-2 sudo[188184]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:06 compute-2 sudo[188336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kskqacqzhyccouyncxhegglbfrnnycdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583505.9806643-2287-160000552283716/AnsiballZ_stat.py'
Dec 01 10:05:06 compute-2 sudo[188336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:06 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:05:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:05:06 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc040002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:05:06 compute-2 python3.9[188338]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:05:06 compute-2 sudo[188336]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:06 compute-2 kernel: ganesha.nfsd[180319]: segfault at 50 ip 00007fc11634732e sp 00007fc0c9ffa210 error 4 in libntirpc.so.5.8[7fc11632c000+2c000] likely on CPU 5 (core 0, socket 5)
Dec 01 10:05:06 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec 01 10:05:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:05:06 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc040002b10 fd 38 proxy ignored for local
Dec 01 10:05:06 compute-2 sudo[188459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-foiwwsfxkivrqnuvupwfszxtteqmvjiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583505.9806643-2287-160000552283716/AnsiballZ_copy.py'
Dec 01 10:05:06 compute-2 sudo[188459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:06 compute-2 systemd[1]: Started Process Core Dump (PID 188461/UID 0).
Dec 01 10:05:07 compute-2 python3.9[188462]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583505.9806643-2287-160000552283716/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:07 compute-2 sudo[188459]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:05:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:07.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:05:07 compute-2 ceph-mon[76053]: pgmap v420: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 10:05:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:05:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:07.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:05:07 compute-2 podman[188565]: 2025-12-01 10:05:07.411888214 +0000 UTC m=+0.060771078 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 01 10:05:07 compute-2 sudo[188633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksjhdysofbqfxbgcykkeyzybhtjkpids ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583507.1973305-2287-30154345540637/AnsiballZ_stat.py'
Dec 01 10:05:07 compute-2 sudo[188633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:07 compute-2 python3.9[188635]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:05:07 compute-2 sudo[188633]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:08 compute-2 sudo[188756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxcewyivqqzzyxzpgclbpkitzhpfxfbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583507.1973305-2287-30154345540637/AnsiballZ_copy.py'
Dec 01 10:05:08 compute-2 sudo[188756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:08 compute-2 python3.9[188758]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583507.1973305-2287-30154345540637/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:08 compute-2 sudo[188756]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:08 compute-2 systemd-coredump[188463]: Process 178300 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 55:
                                                    #0  0x00007fc11634732e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Dec 01 10:05:08 compute-2 systemd[1]: systemd-coredump@10-188461-0.service: Deactivated successfully.
Dec 01 10:05:08 compute-2 systemd[1]: systemd-coredump@10-188461-0.service: Consumed 1.612s CPU time.
Dec 01 10:05:08 compute-2 podman[188889]: 2025-12-01 10:05:08.631172371 +0000 UTC m=+0.024049203 container died 7b646f0244cbf6b0aedcb23b49430e30df9543134e376d7c337cac9617fcecfc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 10:05:08 compute-2 sudo[188926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilpabpgzbqrrkfygsctzovcoqtevfdbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583508.3851192-2287-231953358147563/AnsiballZ_stat.py'
Dec 01 10:05:08 compute-2 sudo[188926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:08 compute-2 systemd[1]: var-lib-containers-storage-overlay-092d456032e1daa697416cb36bacc889c224a44c9a018fcf1cb6d116fdc54261-merged.mount: Deactivated successfully.
Dec 01 10:05:08 compute-2 podman[188889]: 2025-12-01 10:05:08.674847297 +0000 UTC m=+0.067724129 container remove 7b646f0244cbf6b0aedcb23b49430e30df9543134e376d7c337cac9617fcecfc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.vendor=CentOS)
Dec 01 10:05:08 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec 01 10:05:08 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec 01 10:05:08 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.723s CPU time.
Dec 01 10:05:08 compute-2 python3.9[188930]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:05:08 compute-2 sudo[188926]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:09.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:09 compute-2 sudo[189081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nclskpzlnzdwnukhnpzyjapiupakyijg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583508.3851192-2287-231953358147563/AnsiballZ_copy.py'
Dec 01 10:05:09 compute-2 sudo[189081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:09 compute-2 python3.9[189083]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583508.3851192-2287-231953358147563/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:09 compute-2 sudo[189081]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:05:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:09.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:05:09 compute-2 ceph-mon[76053]: pgmap v421: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:05:09 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:05:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100509 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:05:09 compute-2 sudo[189234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zetswnmwsommzklgcjzazgeopdvnakmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583509.5145574-2287-76843772281069/AnsiballZ_stat.py'
Dec 01 10:05:09 compute-2 sudo[189234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:09 compute-2 python3.9[189236]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:05:09 compute-2 sudo[189234]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:10 compute-2 sudo[189357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kuisrjekdmritlatqwevccmwggwrrttw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583509.5145574-2287-76843772281069/AnsiballZ_copy.py'
Dec 01 10:05:10 compute-2 sudo[189357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:10 compute-2 python3.9[189359]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583509.5145574-2287-76843772281069/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:10 compute-2 sudo[189357]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:10 compute-2 sudo[189509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scywfzfgfgbutuvclwzqwxelqxmkrouy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583510.7047164-2287-940664176105/AnsiballZ_stat.py'
Dec 01 10:05:11 compute-2 sudo[189509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:05:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:11.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:05:11 compute-2 python3.9[189511]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:05:11 compute-2 sudo[189509]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:11 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:05:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:11.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:11 compute-2 sudo[189634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrytcnyahlyftelwjiglrlnarxwtptwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583510.7047164-2287-940664176105/AnsiballZ_copy.py'
Dec 01 10:05:11 compute-2 sudo[189634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:11 compute-2 ceph-mon[76053]: pgmap v422: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:05:11 compute-2 python3.9[189636]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583510.7047164-2287-940664176105/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:11 compute-2 sudo[189634]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:12 compute-2 sudo[189786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjbueurteudjcstebahaodmoutnscrfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583511.9075518-2287-93240122268257/AnsiballZ_stat.py'
Dec 01 10:05:12 compute-2 sudo[189786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:12 compute-2 python3.9[189788]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:05:12 compute-2 sudo[189786]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:12 compute-2 sudo[189909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfewznlhborofrsclxueqtcrncbjdhog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583511.9075518-2287-93240122268257/AnsiballZ_copy.py'
Dec 01 10:05:12 compute-2 sudo[189909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100512 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:05:12 compute-2 python3.9[189911]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583511.9075518-2287-93240122268257/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:12 compute-2 sudo[189909]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:13.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:13 compute-2 sudo[190063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhnkxkklmjpmuzngdmjmjptyifemymdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583513.0990312-2287-271839143864407/AnsiballZ_stat.py'
Dec 01 10:05:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:13 compute-2 sudo[190063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:13.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:13 compute-2 python3.9[190065]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:05:13 compute-2 sudo[190063]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:13 compute-2 ceph-mon[76053]: pgmap v423: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:05:14 compute-2 sudo[190186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xviyeavbqanqpdlwipvkulshgbrnujqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583513.0990312-2287-271839143864407/AnsiballZ_copy.py'
Dec 01 10:05:14 compute-2 sudo[190186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:14 compute-2 python3.9[190188]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583513.0990312-2287-271839143864407/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:14 compute-2 sudo[190186]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:14 compute-2 sudo[190338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jaooclmlyateritokwimhwxxyyerpxae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583514.3757308-2287-188271148932854/AnsiballZ_stat.py'
Dec 01 10:05:14 compute-2 sudo[190338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:14 compute-2 python3.9[190340]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:05:14 compute-2 sudo[190338]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:15.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:15 compute-2 sudo[190462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdmamicrubewumsinnrqqcsgdvybxpbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583514.3757308-2287-188271148932854/AnsiballZ_copy.py'
Dec 01 10:05:15 compute-2 sudo[190462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:15.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:15 compute-2 python3.9[190464]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583514.3757308-2287-188271148932854/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:15 compute-2 sudo[190462]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:15 compute-2 ceph-mon[76053]: pgmap v424: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Dec 01 10:05:15 compute-2 sudo[190615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwnispnpljbcxpcewavsxinuzwnqhgzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583515.5875833-2287-58724757611592/AnsiballZ_stat.py'
Dec 01 10:05:15 compute-2 sudo[190615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:16 compute-2 python3.9[190617]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:05:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:16 compute-2 sudo[190615]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:16 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:05:16 compute-2 sudo[190738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-extnphuwqadaobbpbjpttmysukhwxkmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583515.5875833-2287-58724757611592/AnsiballZ_copy.py'
Dec 01 10:05:16 compute-2 sudo[190738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:16 compute-2 python3.9[190740]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583515.5875833-2287-58724757611592/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:16 compute-2 sudo[190738]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:05:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:17.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:05:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:17.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:17 compute-2 ceph-mon[76053]: pgmap v425: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:05:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:18 compute-2 podman[190784]: 2025-12-01 10:05:18.444808216 +0000 UTC m=+0.099326729 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller)
Dec 01 10:05:18 compute-2 python3.9[190920]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 10:05:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:19 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 11.
Dec 01 10:05:19 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 10:05:19 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.723s CPU time.
Dec 01 10:05:19 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec 01 10:05:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:19.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:19 compute-2 podman[191048]: 2025-12-01 10:05:19.310061818 +0000 UTC m=+0.040554811 container create 8a1a3c1e5a73fd93ac613132fe3ee08dbc981f60547925fc924ae9b06cdfd018 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 10:05:19 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab48af2807a4f320a6a303f860154584973658b2d388fe86f8f7d498ad32176d/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 01 10:05:19 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab48af2807a4f320a6a303f860154584973658b2d388fe86f8f7d498ad32176d/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 10:05:19 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab48af2807a4f320a6a303f860154584973658b2d388fe86f8f7d498ad32176d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 10:05:19 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab48af2807a4f320a6a303f860154584973658b2d388fe86f8f7d498ad32176d/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 10:05:19 compute-2 podman[191048]: 2025-12-01 10:05:19.370164239 +0000 UTC m=+0.100657252 container init 8a1a3c1e5a73fd93ac613132fe3ee08dbc981f60547925fc924ae9b06cdfd018 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid)
Dec 01 10:05:19 compute-2 podman[191048]: 2025-12-01 10:05:19.374882546 +0000 UTC m=+0.105375549 container start 8a1a3c1e5a73fd93ac613132fe3ee08dbc981f60547925fc924ae9b06cdfd018 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 10:05:19 compute-2 bash[191048]: 8a1a3c1e5a73fd93ac613132fe3ee08dbc981f60547925fc924ae9b06cdfd018
Dec 01 10:05:19 compute-2 podman[191048]: 2025-12-01 10:05:19.29227251 +0000 UTC m=+0.022765533 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 10:05:19 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 10:05:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:19 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 01 10:05:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:19 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 01 10:05:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:19.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:19 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 01 10:05:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:19 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 01 10:05:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:19 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 01 10:05:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:19 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 01 10:05:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:19 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 01 10:05:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:19 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:05:19 compute-2 sudo[191179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-doknsevyqtqdgbpvtvnsjccwyolkkurl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583519.1228065-2906-191964714042966/AnsiballZ_seboolean.py'
Dec 01 10:05:19 compute-2 sudo[191179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:19 compute-2 sudo[191180]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:05:19 compute-2 sudo[191180]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:05:19 compute-2 sudo[191180]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:19 compute-2 python3.9[191193]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Dec 01 10:05:19 compute-2 ceph-mon[76053]: pgmap v426: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Dec 01 10:05:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:21 compute-2 sudo[191179]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:21.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:21 compute-2 ceph-mon[76053]: pgmap v427: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Dec 01 10:05:21 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:05:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:21.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:21 compute-2 sudo[191362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eplumreddekswnkooawdpnzchtnekxbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583521.5534294-2929-200622868208166/AnsiballZ_copy.py'
Dec 01 10:05:21 compute-2 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Dec 01 10:05:21 compute-2 sudo[191362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:22 compute-2 python3.9[191364]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:22 compute-2 sudo[191362]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:22 compute-2 sudo[191514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldmkorfeiiwfwxwihqzhankdegzujoox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583522.1677186-2929-226149533207970/AnsiballZ_copy.py'
Dec 01 10:05:22 compute-2 sudo[191514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:22 compute-2 python3.9[191516]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:22 compute-2 sudo[191514]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:23 compute-2 sudo[191666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kaalljaesopmfcxhyocbymxriacwnmqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583522.7629786-2929-146315864689562/AnsiballZ_copy.py'
Dec 01 10:05:23 compute-2 sudo[191666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:23.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:23 compute-2 python3.9[191668]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:23 compute-2 sudo[191666]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:23.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:23 compute-2 sudo[191820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xojugzhzwtuhfrvnqqwsanvcgeywaedx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583523.3840573-2929-56930912820369/AnsiballZ_copy.py'
Dec 01 10:05:23 compute-2 sudo[191820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:23 compute-2 ceph-mon[76053]: pgmap v428: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 511 B/s wr, 1 op/s
Dec 01 10:05:23 compute-2 python3.9[191822]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:23 compute-2 sudo[191820]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:24 compute-2 sudo[191972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmlujuyjrrsybibssvnmzzwmqaaphcwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583523.964535-2929-57349301790541/AnsiballZ_copy.py'
Dec 01 10:05:24 compute-2 sudo[191972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:24 compute-2 python3.9[191974]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:24 compute-2 sudo[191972]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:24 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:05:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:25.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:25.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:25 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:05:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:25 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:05:25 compute-2 sudo[192126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcfuvcypbrbxztwbswttykqcvixnzdke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583525.3681707-3037-182299039637233/AnsiballZ_copy.py'
Dec 01 10:05:25 compute-2 sudo[192126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:25 compute-2 ceph-mon[76053]: pgmap v429: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 511 B/s wr, 1 op/s
Dec 01 10:05:25 compute-2 python3.9[192128]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:25 compute-2 sudo[192126]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:26 compute-2 sudo[192278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ostvsetkskzlmbupzdtktfeiaqxfrxdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583525.980737-3037-179745433180724/AnsiballZ_copy.py'
Dec 01 10:05:26 compute-2 sudo[192278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:26 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:05:26 compute-2 python3.9[192280]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:26 compute-2 sudo[192278]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:26 compute-2 sudo[192430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zeamgxizaoqurqwmnkzxhdmpbncxfioj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583526.5723372-3037-259588998667351/AnsiballZ_copy.py'
Dec 01 10:05:26 compute-2 sudo[192430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:27 compute-2 python3.9[192432]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:27 compute-2 sudo[192430]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:27.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:27 compute-2 sudo[192584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvzmnlcgbkosfnllayxoygzekgskfwlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583527.1696742-3037-129372978536074/AnsiballZ_copy.py'
Dec 01 10:05:27 compute-2 sudo[192584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:27.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:27 compute-2 python3.9[192586]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:27 compute-2 sudo[192584]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:27 compute-2 ceph-mon[76053]: pgmap v430: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 938 B/s wr, 3 op/s
Dec 01 10:05:28 compute-2 sudo[192736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-faibzgqzectaadnuifztsqwgcwasdfam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583527.7599964-3037-164602807355451/AnsiballZ_copy.py'
Dec 01 10:05:28 compute-2 sudo[192736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:28 compute-2 python3.9[192738]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:28 compute-2 sudo[192736]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:29.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:29 compute-2 sudo[192890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xftzbywrovncrtpekznrnbzsukelektx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583529.0618544-3147-135707673040514/AnsiballZ_systemd.py'
Dec 01 10:05:29 compute-2 sudo[192890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:29.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:29 compute-2 python3.9[192892]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 10:05:29 compute-2 systemd[1]: Reloading.
Dec 01 10:05:29 compute-2 systemd-rc-local-generator[192918]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:05:29 compute-2 systemd-sysv-generator[192923]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:05:29 compute-2 ceph-mon[76053]: pgmap v431: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 938 B/s wr, 3 op/s
Dec 01 10:05:30 compute-2 systemd[1]: Starting libvirt logging daemon socket...
Dec 01 10:05:30 compute-2 systemd[1]: Listening on libvirt logging daemon socket.
Dec 01 10:05:30 compute-2 systemd[1]: Starting libvirt logging daemon admin socket...
Dec 01 10:05:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:30 compute-2 systemd[1]: Listening on libvirt logging daemon admin socket.
Dec 01 10:05:30 compute-2 systemd[1]: Starting libvirt logging daemon...
Dec 01 10:05:30 compute-2 systemd[1]: Started libvirt logging daemon.
Dec 01 10:05:30 compute-2 sudo[192890]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:30 compute-2 sudo[193083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czlogebapcwkzsbnjntrzwhhvgfwnvrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583530.3658774-3147-155543069790496/AnsiballZ_systemd.py'
Dec 01 10:05:30 compute-2 sudo[193083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:30 compute-2 python3.9[193085]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 10:05:30 compute-2 systemd[1]: Reloading.
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:31 compute-2 systemd-rc-local-generator[193113]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:05:31 compute-2 systemd-sysv-generator[193116]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:05:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:05:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:31.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:05:31 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:05:31 compute-2 systemd[1]: Starting libvirt nodedev daemon socket...
Dec 01 10:05:31 compute-2 systemd[1]: Listening on libvirt nodedev daemon socket.
Dec 01 10:05:31 compute-2 systemd[1]: Starting libvirt nodedev daemon admin socket...
Dec 01 10:05:31 compute-2 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Dec 01 10:05:31 compute-2 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Dec 01 10:05:31 compute-2 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Dec 01 10:05:31 compute-2 systemd[1]: Starting libvirt nodedev daemon...
Dec 01 10:05:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:31.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:31 compute-2 systemd[1]: Started libvirt nodedev daemon.
Dec 01 10:05:31 compute-2 sudo[193083]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100531 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:05:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a64000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:05:31 compute-2 sudo[193315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-schfoonwyjogcaudltvdnnmtnwbaufnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583531.595601-3147-101450709900377/AnsiballZ_systemd.py'
Dec 01 10:05:31 compute-2 sudo[193315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:31 compute-2 ceph-mon[76053]: pgmap v432: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 938 B/s wr, 3 op/s
Dec 01 10:05:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:32 compute-2 python3.9[193317]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 10:05:32 compute-2 systemd[1]: Reloading.
Dec 01 10:05:32 compute-2 systemd-sysv-generator[193347]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:05:32 compute-2 systemd-rc-local-generator[193343]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:05:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:32 : epoch 692d685f : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a500016c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:05:32 compute-2 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Dec 01 10:05:32 compute-2 systemd[1]: Starting libvirt proxy daemon admin socket...
Dec 01 10:05:32 compute-2 systemd[1]: Starting libvirt proxy daemon read-only socket...
Dec 01 10:05:32 compute-2 systemd[1]: Listening on libvirt proxy daemon admin socket.
Dec 01 10:05:32 compute-2 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Dec 01 10:05:32 compute-2 systemd[1]: Starting libvirt proxy daemon...
Dec 01 10:05:32 compute-2 systemd[1]: Started libvirt proxy daemon.
Dec 01 10:05:32 compute-2 sudo[193315]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:32 compute-2 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Dec 01 10:05:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:32 : epoch 692d685f : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a40000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:05:32 compute-2 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Dec 01 10:05:33 compute-2 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Dec 01 10:05:33 compute-2 sudo[193530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfjgymvcuvwpdkwiagxlojaufiyejlkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583532.7353835-3147-31851291299248/AnsiballZ_systemd.py'
Dec 01 10:05:33 compute-2 sudo[193530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:33.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:33 compute-2 python3.9[193536]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 10:05:33 compute-2 systemd[1]: Reloading.
Dec 01 10:05:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:05:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:33.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:05:33 compute-2 systemd-rc-local-generator[193566]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:05:33 compute-2 systemd-sysv-generator[193569]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:05:33 compute-2 systemd[1]: Listening on libvirt locking daemon socket.
Dec 01 10:05:33 compute-2 systemd[1]: Starting libvirt QEMU daemon socket...
Dec 01 10:05:33 compute-2 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec 01 10:05:33 compute-2 systemd[1]: Starting Virtual Machine and Container Registration Service...
Dec 01 10:05:33 compute-2 systemd[1]: Listening on libvirt QEMU daemon socket.
Dec 01 10:05:33 compute-2 systemd[1]: Starting libvirt QEMU daemon admin socket...
Dec 01 10:05:33 compute-2 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Dec 01 10:05:33 compute-2 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Dec 01 10:05:33 compute-2 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Dec 01 10:05:33 compute-2 systemd[1]: Started Virtual Machine and Container Registration Service.
Dec 01 10:05:33 compute-2 systemd[1]: Starting libvirt QEMU daemon...
Dec 01 10:05:33 compute-2 systemd[1]: Started libvirt QEMU daemon.
Dec 01 10:05:33 compute-2 sudo[193530]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:33 : epoch 692d685f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a5c001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:05:33 compute-2 setroubleshoot[193354]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l a2f286d8-11e6-41b7-8ed6-3fd1e2c7468d
Dec 01 10:05:33 compute-2 ceph-mon[76053]: pgmap v433: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.1 KiB/s rd, 1.1 KiB/s wr, 4 op/s
Dec 01 10:05:33 compute-2 setroubleshoot[193354]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Dec 01 10:05:33 compute-2 setroubleshoot[193354]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l a2f286d8-11e6-41b7-8ed6-3fd1e2c7468d
Dec 01 10:05:34 compute-2 setroubleshoot[193354]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Dec 01 10:05:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:34 compute-2 sudo[193753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzoqokwfitevqtytzgfvlvhtogcjicxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583533.9265978-3147-169353268566242/AnsiballZ_systemd.py'
Dec 01 10:05:34 compute-2 sudo[193753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:34 : epoch 692d685f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a44000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:05:34 compute-2 python3.9[193755]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 10:05:34 compute-2 systemd[1]: Reloading.
Dec 01 10:05:34 compute-2 systemd-rc-local-generator[193782]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:05:34 compute-2 systemd-sysv-generator[193785]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:05:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100534 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:05:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:34 : epoch 692d685f : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a50001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:05:34 compute-2 systemd[1]: Starting libvirt secret daemon socket...
Dec 01 10:05:34 compute-2 systemd[1]: Listening on libvirt secret daemon socket.
Dec 01 10:05:34 compute-2 systemd[1]: Starting libvirt secret daemon admin socket...
Dec 01 10:05:34 compute-2 systemd[1]: Starting libvirt secret daemon read-only socket...
Dec 01 10:05:34 compute-2 systemd[1]: Listening on libvirt secret daemon admin socket.
Dec 01 10:05:34 compute-2 systemd[1]: Listening on libvirt secret daemon read-only socket.
Dec 01 10:05:34 compute-2 systemd[1]: Starting libvirt secret daemon...
Dec 01 10:05:34 compute-2 systemd[1]: Started libvirt secret daemon.
Dec 01 10:05:34 compute-2 sudo[193753]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:05:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:35.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:05:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:35.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:35 : epoch 692d685f : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a400016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:05:35 compute-2 auditd[703]: Audit daemon rotating log files
Dec 01 10:05:36 compute-2 ceph-mon[76053]: pgmap v434: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 597 B/s wr, 2 op/s
Dec 01 10:05:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:36 compute-2 sudo[193967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpywjwzngfrxxzquzdgfaznbbsgfrexk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583535.9571435-3257-36380322012038/AnsiballZ_file.py'
Dec 01 10:05:36 compute-2 sudo[193967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:36 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:05:36 compute-2 python3.9[193969]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:36 : epoch 692d685f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a5c0023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:05:36 compute-2 sudo[193967]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:36 : epoch 692d685f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a44001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:05:36 compute-2 sudo[194119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsktonkadtutflnakyhnjjlhzuihwmqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583536.7267816-3281-14255482565507/AnsiballZ_find.py'
Dec 01 10:05:36 compute-2 sudo[194119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:05:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:37.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:05:37 compute-2 python3.9[194121]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 01 10:05:37 compute-2 sudo[194119]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:37.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:37 compute-2 sudo[194286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akppflovtixgvcakspfulkamqnlqepnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583537.517832-3305-18584342195450/AnsiballZ_command.py'
Dec 01 10:05:37 compute-2 sudo[194286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:37 : epoch 692d685f : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a50001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:05:37 compute-2 podman[194247]: 2025-12-01 10:05:37.845184302 +0000 UTC m=+0.083474348 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 01 10:05:37 compute-2 python3.9[194294]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;
                                             echo ceph
                                             awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 10:05:38 compute-2 sudo[194286]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:38 compute-2 ceph-mon[76053]: pgmap v435: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 597 B/s wr, 2 op/s
Dec 01 10:05:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:38 : epoch 692d685f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a50001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:05:38 compute-2 kernel: ganesha.nfsd[193178]: segfault at 50 ip 00007f8b1068032e sp 00007f8ad5ffa210 error 4 in libntirpc.so.5.8[7f8b10665000+2c000] likely on CPU 0 (core 0, socket 0)
Dec 01 10:05:38 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec 01 10:05:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:38 : epoch 692d685f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a50001fe0 fd 38 proxy ignored for local
Dec 01 10:05:38 compute-2 python3.9[194450]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 01 10:05:38 compute-2 systemd[1]: Started Process Core Dump (PID 194452/UID 0).
Dec 01 10:05:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.002000048s ======
Dec 01 10:05:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:39.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000048s
Dec 01 10:05:39 compute-2 ceph-mon[76053]: pgmap v436: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 170 B/s wr, 1 op/s
Dec 01 10:05:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:39.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:39 compute-2 sudo[194606]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:05:39 compute-2 sudo[194606]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:05:39 compute-2 sudo[194606]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:39 compute-2 python3.9[194605]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:05:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:40 compute-2 systemd-coredump[194453]: Process 191068 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 45:
                                                    #0  0x00007f8b1068032e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Dec 01 10:05:40 compute-2 python3.9[194751]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583539.3492303-3361-243301307510653/.source.xml follow=False _original_basename=secret.xml.j2 checksum=b828192784cecb28a4416a509fc39e7cc46c1495 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:05:40 compute-2 systemd[1]: systemd-coredump@11-194452-0.service: Deactivated successfully.
Dec 01 10:05:40 compute-2 systemd[1]: systemd-coredump@11-194452-0.service: Consumed 1.376s CPU time.
Dec 01 10:05:40 compute-2 podman[194780]: 2025-12-01 10:05:40.436812807 +0000 UTC m=+0.034141582 container died 8a1a3c1e5a73fd93ac613132fe3ee08dbc981f60547925fc924ae9b06cdfd018 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 10:05:40 compute-2 systemd[1]: var-lib-containers-storage-overlay-ab48af2807a4f320a6a303f860154584973658b2d388fe86f8f7d498ad32176d-merged.mount: Deactivated successfully.
Dec 01 10:05:40 compute-2 podman[194780]: 2025-12-01 10:05:40.474976528 +0000 UTC m=+0.072305283 container remove 8a1a3c1e5a73fd93ac613132fe3ee08dbc981f60547925fc924ae9b06cdfd018 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 10:05:40 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec 01 10:05:40 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec 01 10:05:40 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.611s CPU time.
Dec 01 10:05:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:41 compute-2 sudo[194948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ticyxtaotdpoyfwdfmxejjqhuixmhivf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583540.8000472-3407-15394013209012/AnsiballZ_command.py'
Dec 01 10:05:41 compute-2 sudo[194948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:05:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:41.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:05:41 compute-2 python3.9[194950]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 365f19c2-81e5-5edd-b6b4-280555214d3a
                                             virsh secret-define --file /tmp/secret.xml
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 10:05:41 compute-2 polkitd[43617]: Registered Authentication Agent for unix-process:194953:340316 (system bus name :1.1856 [pkttyagent --process 194953 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Dec 01 10:05:41 compute-2 polkitd[43617]: Unregistered Authentication Agent for unix-process:194953:340316 (system bus name :1.1856, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Dec 01 10:05:41 compute-2 ceph-mon[76053]: pgmap v437: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 170 B/s wr, 1 op/s
Dec 01 10:05:41 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:05:41 compute-2 polkitd[43617]: Registered Authentication Agent for unix-process:194952:340315 (system bus name :1.1857 [pkttyagent --process 194952 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Dec 01 10:05:41 compute-2 polkitd[43617]: Unregistered Authentication Agent for unix-process:194952:340315 (system bus name :1.1857, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Dec 01 10:05:41 compute-2 sudo[194948]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:41.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:42 compute-2 python3.9[195114]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.506642) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583542506801, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 624, "num_deletes": 251, "total_data_size": 1178890, "memory_usage": 1200928, "flush_reason": "Manual Compaction"}
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583542513452, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 772229, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17595, "largest_seqno": 18214, "table_properties": {"data_size": 769107, "index_size": 1094, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7040, "raw_average_key_size": 18, "raw_value_size": 762989, "raw_average_value_size": 2040, "num_data_blocks": 49, "num_entries": 374, "num_filter_entries": 374, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764583501, "oldest_key_time": 1764583501, "file_creation_time": 1764583542, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 6844 microseconds, and 3114 cpu microseconds.
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.513517) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 772229 bytes OK
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.513536) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.514569) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.514587) EVENT_LOG_v1 {"time_micros": 1764583542514582, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.514624) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 1175447, prev total WAL file size 1175447, number of live WAL files 2.
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.515174) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(754KB)], [30(12MB)]
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583542515267, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 14281341, "oldest_snapshot_seqno": -1}
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 4871 keys, 12091486 bytes, temperature: kUnknown
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583542579296, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 12091486, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12058475, "index_size": 19717, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12229, "raw_key_size": 123199, "raw_average_key_size": 25, "raw_value_size": 11969449, "raw_average_value_size": 2457, "num_data_blocks": 819, "num_entries": 4871, "num_filter_entries": 4871, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764583542, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.579830) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 12091486 bytes
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.581252) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 222.0 rd, 187.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 12.9 +0.0 blob) out(11.5 +0.0 blob), read-write-amplify(34.2) write-amplify(15.7) OK, records in: 5381, records dropped: 510 output_compression: NoCompression
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.581292) EVENT_LOG_v1 {"time_micros": 1764583542581276, "job": 16, "event": "compaction_finished", "compaction_time_micros": 64338, "compaction_time_cpu_micros": 27695, "output_level": 6, "num_output_files": 1, "total_output_size": 12091486, "num_input_records": 5381, "num_output_records": 4871, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583542581625, "job": 16, "event": "table_file_deletion", "file_number": 32}
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583542584392, "job": 16, "event": "table_file_deletion", "file_number": 30}
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.515074) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.584508) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.584515) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.584517) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.584519) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:05:42 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.584520) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:05:42 compute-2 sudo[195264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkgirxgevqxexgnhkqklmgnazypqvuso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583542.5335903-3455-202249779743510/AnsiballZ_command.py'
Dec 01 10:05:42 compute-2 sudo[195264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:43 compute-2 sshd-session[194296]: Connection reset by 147.185.132.88 port 59696 [preauth]
Dec 01 10:05:43 compute-2 sudo[195264]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:43.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:43.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:43 compute-2 sudo[195419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpfygwowxekjpkaewrauzyltiexdazhq ; FSID=365f19c2-81e5-5edd-b6b4-280555214d3a KEY=AQDkYy1pAAAAABAAkbJz0WufsOiiJsVlIdW4cg== /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583543.3382945-3478-78563938433683/AnsiballZ_command.py'
Dec 01 10:05:43 compute-2 sudo[195419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:43 compute-2 ceph-mon[76053]: pgmap v438: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 852 B/s rd, 170 B/s wr, 1 op/s
Dec 01 10:05:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100543 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:05:43 compute-2 polkitd[43617]: Registered Authentication Agent for unix-process:195422:340571 (system bus name :1.1860 [pkttyagent --process 195422 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Dec 01 10:05:43 compute-2 polkitd[43617]: Unregistered Authentication Agent for unix-process:195422:340571 (system bus name :1.1860, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Dec 01 10:05:43 compute-2 sudo[195419]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:44 compute-2 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Dec 01 10:05:44 compute-2 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.016s CPU time.
Dec 01 10:05:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:44 compute-2 systemd[1]: setroubleshootd.service: Deactivated successfully.
Dec 01 10:05:44 compute-2 sudo[195452]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:05:44 compute-2 sudo[195452]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:05:44 compute-2 sudo[195452]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:44 compute-2 sudo[195477]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:05:44 compute-2 sudo[195477]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:05:44 compute-2 sudo[195656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrlonhanargopjyxjkginbouvduhstdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583544.3951824-3502-130855674719753/AnsiballZ_copy.py'
Dec 01 10:05:44 compute-2 sudo[195656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:44 compute-2 sudo[195477]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100544 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:05:44 compute-2 python3.9[195660]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:44 compute-2 sudo[195656]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:05:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:45.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:05:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:45.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:45 compute-2 sudo[195812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqvccciparnaykradjacvgilexbvzcrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583545.2349188-3527-182576569955166/AnsiballZ_stat.py'
Dec 01 10:05:45 compute-2 sudo[195812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:45 compute-2 ceph-mon[76053]: pgmap v439: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 B/s wr, 0 op/s
Dec 01 10:05:45 compute-2 python3.9[195814]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:05:45 compute-2 sudo[195812]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:46 compute-2 sudo[195935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjeachlqjyliyciilfxxkspwanlkbcme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583545.2349188-3527-182576569955166/AnsiballZ_copy.py'
Dec 01 10:05:46 compute-2 sudo[195935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:46 compute-2 python3.9[195937]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583545.2349188-3527-182576569955166/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:46 compute-2 sudo[195935]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:46 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:05:46 compute-2 sudo[196087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppbgtdsvzllanxrevdtwdfurfdkfabtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583546.6545446-3575-249482464524142/AnsiballZ_file.py'
Dec 01 10:05:46 compute-2 sudo[196087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:47 compute-2 python3.9[196089]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:47 compute-2 sudo[196087]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:05:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:47.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:05:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:47.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:47 compute-2 sudo[196241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmbnloafcqybtwtxrfihxbtdxrlqxkqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583547.4446435-3599-145132194114163/AnsiballZ_stat.py'
Dec 01 10:05:47 compute-2 sudo[196241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:47 compute-2 ceph-mon[76053]: pgmap v440: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 B/s wr, 0 op/s
Dec 01 10:05:47 compute-2 python3.9[196243]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:05:47 compute-2 sudo[196241]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:48 compute-2 sudo[196319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dumvfftfkrskoqhlsssgyyitzqwujygr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583547.4446435-3599-145132194114163/AnsiballZ_file.py'
Dec 01 10:05:48 compute-2 sudo[196319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:48 compute-2 python3.9[196321]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:48 compute-2 sudo[196319]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:48 compute-2 sudo[196483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gatqbatpqopugqqnbuziwijgzkwwecoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583548.6909618-3635-214526870883372/AnsiballZ_stat.py'
Dec 01 10:05:48 compute-2 sudo[196483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:49 compute-2 podman[196445]: 2025-12-01 10:05:49.0427279 +0000 UTC m=+0.103201325 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 01 10:05:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:49 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:05:49 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:05:49 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:05:49 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:05:49 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:05:49 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:05:49 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:05:49 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:05:49 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:05:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:49.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:49 compute-2 python3.9[196491]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:05:49 compute-2 sudo[196483]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:49.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:49 compute-2 sudo[196576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojmbxgpysmyllwqewaddqqzbskgijxyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583548.6909618-3635-214526870883372/AnsiballZ_file.py'
Dec 01 10:05:49 compute-2 sudo[196576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:49 compute-2 python3.9[196578]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.1aucan99 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:49 compute-2 sudo[196576]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:50 compute-2 ceph-mon[76053]: pgmap v441: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Dec 01 10:05:50 compute-2 ceph-mon[76053]: pgmap v442: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 97 B/s rd, 0 op/s
Dec 01 10:05:50 compute-2 ceph-mon[76053]: pgmap v443: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 121 B/s rd, 0 op/s
Dec 01 10:05:50 compute-2 sudo[196728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-triuziwvyugcnkwswnyjafzqjfxmagbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583549.9594958-3670-126672551316400/AnsiballZ_stat.py'
Dec 01 10:05:50 compute-2 sudo[196728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:50 compute-2 python3.9[196730]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:05:50 compute-2 sudo[196728]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:50 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 12.
Dec 01 10:05:50 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 10:05:50 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.611s CPU time.
Dec 01 10:05:50 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec 01 10:05:50 compute-2 sudo[196818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqdjksjwflbchmaismtaddjxycqrtuzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583549.9594958-3670-126672551316400/AnsiballZ_file.py'
Dec 01 10:05:50 compute-2 sudo[196818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:50 compute-2 podman[196853]: 2025-12-01 10:05:50.916966685 +0000 UTC m=+0.044203169 container create 43f4e5db97a378ac1568359b572993dab259dd0179f811da5a7e795e696238ba (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 01 10:05:50 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9ece781eac7d44f9adde4c9a385559b8088d6e677b8e9f086f7a414c83ac191/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 01 10:05:50 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9ece781eac7d44f9adde4c9a385559b8088d6e677b8e9f086f7a414c83ac191/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 10:05:50 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9ece781eac7d44f9adde4c9a385559b8088d6e677b8e9f086f7a414c83ac191/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 10:05:50 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9ece781eac7d44f9adde4c9a385559b8088d6e677b8e9f086f7a414c83ac191/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 10:05:50 compute-2 python3.9[196826]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:50 compute-2 podman[196853]: 2025-12-01 10:05:50.981821884 +0000 UTC m=+0.109058368 container init 43f4e5db97a378ac1568359b572993dab259dd0179f811da5a7e795e696238ba (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 01 10:05:50 compute-2 podman[196853]: 2025-12-01 10:05:50.990108209 +0000 UTC m=+0.117344693 container start 43f4e5db97a378ac1568359b572993dab259dd0179f811da5a7e795e696238ba (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 10:05:50 compute-2 podman[196853]: 2025-12-01 10:05:50.896552773 +0000 UTC m=+0.023789277 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 10:05:50 compute-2 bash[196853]: 43f4e5db97a378ac1568359b572993dab259dd0179f811da5a7e795e696238ba
Dec 01 10:05:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:05:51 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 01 10:05:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:05:51 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 01 10:05:51 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 10:05:51 compute-2 sudo[196818]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:05:51 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 01 10:05:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:05:51 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 01 10:05:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:05:51 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 01 10:05:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:05:51 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 01 10:05:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:05:51 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 01 10:05:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:05:51 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:05:51 compute-2 ceph-mon[76053]: pgmap v444: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 605 B/s rd, 121 B/s wr, 0 op/s
Dec 01 10:05:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:51.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:51 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:05:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:51.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:51 compute-2 sudo[197062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwiuwhkvxbodowauomujvfbclnknxgla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583551.2739298-3710-134506917987400/AnsiballZ_command.py'
Dec 01 10:05:51 compute-2 sudo[197062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:51 compute-2 python3.9[197064]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 10:05:51 compute-2 sudo[197062]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:52 compute-2 sudo[197215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhoofoapenogikgfpiizgxltaawfsluh ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764583552.0395856-3734-704355545970/AnsiballZ_edpm_nftables_from_files.py'
Dec 01 10:05:52 compute-2 sudo[197215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:52 compute-2 python3[197217]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 01 10:05:52 compute-2 sudo[197215]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:53.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:53 compute-2 sudo[197368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkcecwhavrrcssglehfaoedcigagvxal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583552.9660032-3758-25970643330861/AnsiballZ_stat.py'
Dec 01 10:05:53 compute-2 sudo[197368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:53.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:53 compute-2 python3.9[197370]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:05:53 compute-2 sudo[197368]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:53 compute-2 sudo[197447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emgjkltdqbpagqklfhicjtewqhatkrvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583552.9660032-3758-25970643330861/AnsiballZ_file.py'
Dec 01 10:05:53 compute-2 sudo[197447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:53 compute-2 python3.9[197449]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:53 compute-2 sudo[197447]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:54 compute-2 ceph-mon[76053]: pgmap v445: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 605 B/s rd, 121 B/s wr, 0 op/s
Dec 01 10:05:54 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:05:54 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:05:54 compute-2 sudo[197450]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:05:54 compute-2 sudo[197450]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:05:54 compute-2 sudo[197450]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:54 compute-2 sudo[197624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbphjvcpbbhifcsnwsnxsymsjyulzvsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583554.2769938-3794-271746190414858/AnsiballZ_stat.py'
Dec 01 10:05:54 compute-2 sudo[197624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:54 compute-2 python3.9[197626]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:05:54 compute-2 sudo[197624]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:05:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:55 compute-2 sudo[197702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjscssogtimsuvmnotpifmajlfqsrnfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583554.2769938-3794-271746190414858/AnsiballZ_file.py'
Dec 01 10:05:55 compute-2 sudo[197702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:55.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:55 compute-2 python3.9[197704]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:55 compute-2 sudo[197702]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:55.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:55 compute-2 sudo[197856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hiywjhfllzyhkaalcxgyjmsehgthexur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583555.5812454-3830-150271399410851/AnsiballZ_stat.py'
Dec 01 10:05:55 compute-2 sudo[197856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:56 compute-2 ceph-mon[76053]: pgmap v446: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 605 B/s rd, 121 B/s wr, 0 op/s
Dec 01 10:05:56 compute-2 python3.9[197858]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:05:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:56 compute-2 sudo[197856]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:56 compute-2 sudo[197934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwijxzttruvdxugsvjqnxxuiqsiqabnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583555.5812454-3830-150271399410851/AnsiballZ_file.py'
Dec 01 10:05:56 compute-2 sudo[197934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:56 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:05:56 compute-2 python3.9[197936]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:56 compute-2 sudo[197934]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:57.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:05:57 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:05:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:05:57 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:05:57 compute-2 ceph-mon[76053]: pgmap v447: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 847 B/s wr, 2 op/s
Dec 01 10:05:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:57.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:57 compute-2 sudo[198088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtmqlspzkjdixtdtciqotfhofegushjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583557.3995092-3866-224534309538846/AnsiballZ_stat.py'
Dec 01 10:05:57 compute-2 sudo[198088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:57 compute-2 python3.9[198090]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:05:57 compute-2 sudo[198088]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:58 compute-2 sudo[198166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hykccfrstfvrypqztramvbcyvjqluhnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583557.3995092-3866-224534309538846/AnsiballZ_file.py'
Dec 01 10:05:58 compute-2 sudo[198166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:58 compute-2 python3.9[198168]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:58 compute-2 sudo[198166]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:05:59 compute-2 sudo[198318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agglvvyspwgfxrnausxzkiuzbmjejgzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583558.6791708-3902-219949368460740/AnsiballZ_stat.py'
Dec 01 10:05:59 compute-2 sudo[198318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:05:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:59.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:05:59 compute-2 python3.9[198320]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:05:59 compute-2 sudo[198318]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:05:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:05:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:59.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:05:59 compute-2 sudo[198445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-voawyibedzbkdaqvxcsinignqhlyjpft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583558.6791708-3902-219949368460740/AnsiballZ_copy.py'
Dec 01 10:05:59 compute-2 sudo[198445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:05:59 compute-2 sudo[198448]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:05:59 compute-2 sudo[198448]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:05:59 compute-2 python3.9[198447]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583558.6791708-3902-219949368460740/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:05:59 compute-2 sudo[198448]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:59 compute-2 sudo[198445]: pam_unix(sudo:session): session closed for user root
Dec 01 10:05:59 compute-2 ceph-mon[76053]: pgmap v448: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.6 KiB/s rd, 716 B/s wr, 2 op/s
Dec 01 10:06:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:00 compute-2 sudo[198622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndoamvoynnhluxjplbgcssnvneqaqbmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583560.164025-3947-33969298500839/AnsiballZ_file.py'
Dec 01 10:06:00 compute-2 sudo[198622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:00 compute-2 python3.9[198624]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:06:00 compute-2 sudo[198622]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:01 compute-2 ceph-mon[76053]: pgmap v449: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.1 KiB/s rd, 1.1 KiB/s wr, 4 op/s
Dec 01 10:06:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:01.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:01 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:06:01 compute-2 sudo[198776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqrgomimvaopaadyaebjdbktligiludj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583560.9551704-3971-241082515549183/AnsiballZ_command.py'
Dec 01 10:06:01 compute-2 sudo[198776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:01.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:01 compute-2 python3.9[198778]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 10:06:01 compute-2 sudo[198776]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:02 compute-2 sudo[198931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbawdkklazwronavgxmorcervcjkuqhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583562.0256732-3995-109326103430522/AnsiballZ_blockinfile.py'
Dec 01 10:06:02 compute-2 sudo[198931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:02 compute-2 python3.9[198933]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:06:02 compute-2 sudo[198931]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:06:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:03.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:06:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 10:06:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec 01 10:06:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec 01 10:06:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec 01 10:06:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec 01 10:06:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec 01 10:06:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec 01 10:06:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:06:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:06:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:06:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec 01 10:06:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:06:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec 01 10:06:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec 01 10:06:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec 01 10:06:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec 01 10:06:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec 01 10:06:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec 01 10:06:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec 01 10:06:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec 01 10:06:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec 01 10:06:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec 01 10:06:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec 01 10:06:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec 01 10:06:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 10:06:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec 01 10:06:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 10:06:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:06:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:03.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:06:03 compute-2 sudo[199098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puhehtsgzcdhjrotvdjiyjonmnvmwjwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583563.2039437-4022-162086781749948/AnsiballZ_command.py'
Dec 01 10:06:03 compute-2 sudo[199098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:03 compute-2 python3.9[199100]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 10:06:03 compute-2 sudo[199098]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:03 compute-2 ceph-mon[76053]: pgmap v450: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 10:06:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:04 compute-2 sudo[199254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-giepemhqqrvovadfiipdfoddsjpbpcjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583564.1028588-4046-239792365130098/AnsiballZ_stat.py'
Dec 01 10:06:04 compute-2 sudo[199254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:04 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398000da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:04 compute-2 python3.9[199256]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 10:06:04 compute-2 sudo[199254]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:06:04.690 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:06:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:06:04.693 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:06:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:06:04.693 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:06:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:04 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff380000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:05.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:05 compute-2 sudo[199409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hiehkgixdplnbeladztzgagezjhranuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583564.9629307-4070-122201882064203/AnsiballZ_command.py'
Dec 01 10:06:05 compute-2 sudo[199409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:05 compute-2 python3.9[199411]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 10:06:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:06:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:05.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:06:05 compute-2 sudo[199409]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:05 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100605 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:06:06 compute-2 ceph-mon[76053]: pgmap v451: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 10:06:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:06 compute-2 sudo[199565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlrtqotebqgbdwrigdimmnkhrzykpewd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583565.7951624-4094-280993085658974/AnsiballZ_file.py'
Dec 01 10:06:06 compute-2 sudo[199565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:06 compute-2 python3.9[199567]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:06:06 compute-2 sudo[199565]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:06 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:06:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:06 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a8001d70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:06 compute-2 sudo[199717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdnjpyprfghuxeoczhhtrnahqmmtgqsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583566.5453138-4118-221862500014079/AnsiballZ_stat.py'
Dec 01 10:06:06 compute-2 sudo[199717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100606 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:06:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:06 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a8001d70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:07 compute-2 python3.9[199719]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:06:07 compute-2 sudo[199717]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:06:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:07.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:06:07 compute-2 sudo[199842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxhbfdrdkztgwpshjolwekrmyojwjkcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583566.5453138-4118-221862500014079/AnsiballZ_copy.py'
Dec 01 10:06:07 compute-2 sudo[199842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:06:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:07.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:06:07 compute-2 python3.9[199844]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583566.5453138-4118-221862500014079/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:06:07 compute-2 sudo[199842]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:07 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3800016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:08 compute-2 ceph-mon[76053]: pgmap v452: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.8 KiB/s rd, 1023 B/s wr, 4 op/s
Dec 01 10:06:08 compute-2 sudo[200011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzbxbushimzggozvtirncautrxwdlkjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583567.9281168-4163-109050828688019/AnsiballZ_stat.py'
Dec 01 10:06:08 compute-2 sudo[200011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:08 compute-2 podman[199968]: 2025-12-01 10:06:08.25166908 +0000 UTC m=+0.057517779 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 01 10:06:08 compute-2 python3.9[200016]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:06:08 compute-2 sudo[200011]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:08 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002c80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:08 compute-2 sudo[200137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aaxjyekmpzrugdvmqnzkdslhdldqwpue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583567.9281168-4163-109050828688019/AnsiballZ_copy.py'
Dec 01 10:06:08 compute-2 sudo[200137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:08 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398001ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:08 compute-2 python3.9[200139]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583567.9281168-4163-109050828688019/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:06:08 compute-2 sudo[200137]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:09 compute-2 ceph-mon[76053]: pgmap v453: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 511 B/s wr, 2 op/s
Dec 01 10:06:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:09.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:06:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:09.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:06:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:09 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3800016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:06:10 compute-2 sudo[200291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpshvatagaajqkbswylbnjcvodbzwdrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583569.990369-4208-146461031395331/AnsiballZ_stat.py'
Dec 01 10:06:10 compute-2 sudo[200291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:10 compute-2 python3.9[200293]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:06:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:10 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3800016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:10 compute-2 sudo[200291]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100610 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:06:10 compute-2 sudo[200414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbdlqnyvqwocjanwltnfvznyjxccznmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583569.990369-4208-146461031395331/AnsiballZ_copy.py'
Dec 01 10:06:10 compute-2 sudo[200414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:10 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a80089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:11 compute-2 python3.9[200416]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583569.990369-4208-146461031395331/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:06:11 compute-2 sudo[200414]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:11.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:11 compute-2 ceph-mon[76053]: pgmap v454: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 511 B/s wr, 2 op/s
Dec 01 10:06:11 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:06:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:11.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:11 compute-2 sudo[200568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmldboydyeslwvfjggekahumgoqtylnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583571.373376-4254-30742995889764/AnsiballZ_systemd.py'
Dec 01 10:06:11 compute-2 sudo[200568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:11 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398001ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:11 compute-2 python3.9[200570]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 10:06:11 compute-2 systemd[1]: Reloading.
Dec 01 10:06:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:12 compute-2 systemd-sysv-generator[200600]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:06:12 compute-2 systemd-rc-local-generator[200597]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:06:12 compute-2 systemd[1]: Reached target edpm_libvirt.target.
Dec 01 10:06:12 compute-2 sudo[200568]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:12 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3800016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:12 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002c80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:12 compute-2 sudo[200758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sixitqnbusuwsddirwoutxzxhxeqlkpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583572.69311-4277-106252081344810/AnsiballZ_systemd.py'
Dec 01 10:06:12 compute-2 sudo[200758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:13.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:13 compute-2 python3.9[200760]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 01 10:06:13 compute-2 systemd[1]: Reloading.
Dec 01 10:06:13 compute-2 systemd-sysv-generator[200793]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:06:13 compute-2 systemd-rc-local-generator[200790]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:06:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:13.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:13 compute-2 systemd[1]: Reloading.
Dec 01 10:06:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:13 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a80089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:13 compute-2 systemd-sysv-generator[200830]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:06:13 compute-2 systemd-rc-local-generator[200826]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:06:13 compute-2 ceph-mon[76053]: pgmap v455: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 B/s wr, 0 op/s
Dec 01 10:06:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:14 compute-2 sudo[200758]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:14 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3980027c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:14 compute-2 sshd-session[142082]: Connection closed by 192.168.122.30 port 51380
Dec 01 10:06:14 compute-2 sshd-session[142079]: pam_unix(sshd:session): session closed for user zuul
Dec 01 10:06:14 compute-2 systemd[1]: session-52.scope: Deactivated successfully.
Dec 01 10:06:14 compute-2 systemd[1]: session-52.scope: Consumed 3min 36.779s CPU time.
Dec 01 10:06:14 compute-2 systemd-logind[795]: Session 52 logged out. Waiting for processes to exit.
Dec 01 10:06:14 compute-2 systemd-logind[795]: Removed session 52.
Dec 01 10:06:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:14 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3800016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:06:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:15.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:06:15 compute-2 ceph-mon[76053]: pgmap v456: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 B/s wr, 0 op/s
Dec 01 10:06:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:15.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:15 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002c80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:16 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:06:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:16 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3980027c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:16 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3800032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:06:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:17.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:06:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:06:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:17.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:06:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:17 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a80096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:17 compute-2 ceph-mon[76053]: pgmap v457: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Dec 01 10:06:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:18 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a80096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:18 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3980027c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:06:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:19.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:06:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:19 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:06:19 compute-2 podman[200865]: 2025-12-01 10:06:19.43678996 +0000 UTC m=+0.089447695 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 10:06:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:19.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:19 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3800032f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:19 compute-2 sudo[200893]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:06:19 compute-2 sudo[200893]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:06:19 compute-2 sudo[200893]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:19 compute-2 ceph-mon[76053]: pgmap v458: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:06:20 compute-2 sshd-session[200918]: Accepted publickey for zuul from 192.168.122.30 port 47414 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 10:06:20 compute-2 systemd-logind[795]: New session 53 of user zuul.
Dec 01 10:06:20 compute-2 systemd[1]: Started Session 53 of User zuul.
Dec 01 10:06:20 compute-2 sshd-session[200918]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 10:06:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:20 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002c80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:20 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a80096e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:21 compute-2 python3.9[201071]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 10:06:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:06:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:21.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:06:21 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:06:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:21.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:21 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:22 compute-2 ceph-mon[76053]: pgmap v459: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 597 B/s wr, 2 op/s
Dec 01 10:06:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:22 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:06:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:22 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:06:22 compute-2 python3.9[201227]: ansible-ansible.builtin.service_facts Invoked
Dec 01 10:06:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:22 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3800032f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:22 compute-2 network[201244]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 01 10:06:22 compute-2 network[201245]: 'network-scripts' will be removed from distribution in near future.
Dec 01 10:06:22 compute-2 network[201246]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 01 10:06:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:22 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002c80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:23.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:23.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:23 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a80096e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:24 compute-2 ceph-mon[76053]: pgmap v460: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 597 B/s wr, 2 op/s
Dec 01 10:06:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:24 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:24 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff380003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:06:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:06:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:25.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:06:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:25 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 10:06:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:06:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:25.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:06:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:25 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002c80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:26 compute-2 ceph-mon[76053]: pgmap v461: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 597 B/s wr, 2 op/s
Dec 01 10:06:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:26 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:06:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:26 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a80096e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:26 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a80096e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:06:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:27.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:06:27 compute-2 sudo[201522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdshspmrzuhxvorlclguuvpajszfqpna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583587.0911143-103-217105281646771/AnsiballZ_setup.py'
Dec 01 10:06:27 compute-2 sudo[201522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:27.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:27 compute-2 python3.9[201524]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 01 10:06:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:27 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff380003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:28 compute-2 sudo[201522]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:28 compute-2 ceph-mon[76053]: pgmap v462: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 10:06:28 compute-2 sudo[201606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsvvawelejizsqmacssxplqcuidrbqtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583587.0911143-103-217105281646771/AnsiballZ_dnf.py'
Dec 01 10:06:28 compute-2 sudo[201606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:28 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002c80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:28 compute-2 python3.9[201608]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 10:06:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:28 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002c80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:29 compute-2 ceph-mon[76053]: pgmap v463: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 10:06:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:29.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:29.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:29 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a80096e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:30 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff380003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:30 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002c80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:31.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:31 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:06:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:06:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:31.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:06:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:31 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002c80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:31 compute-2 ceph-mon[76053]: pgmap v464: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 10:06:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:32 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a80096e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100632 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 3ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:06:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:32 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff380003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:06:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:33.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:06:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:33.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:33 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002c80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:34 compute-2 ceph-mon[76053]: pgmap v465: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Dec 01 10:06:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:34 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002c80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:34 compute-2 sudo[201606]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:34 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a80096e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:35 compute-2 sudo[201768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uguasgonljxhjddwgvqrtmgcqlfscskq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583594.7777286-140-123418842856948/AnsiballZ_stat.py'
Dec 01 10:06:35 compute-2 sudo[201768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:06:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:35.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:06:35 compute-2 python3.9[201770]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 10:06:35 compute-2 sudo[201768]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:06:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:35.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:06:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:35 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff380003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:36 compute-2 ceph-mon[76053]: pgmap v466: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Dec 01 10:06:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:36 compute-2 sudo[201921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twoomfkjdrfmtfrtwobckfhvqfhuhmuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583595.813817-170-32792746298108/AnsiballZ_command.py'
Dec 01 10:06:36 compute-2 sudo[201921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:36 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:06:36 compute-2 python3.9[201923]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 10:06:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:36 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff378000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:36 compute-2 sudo[201921]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:36 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:37.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:37 compute-2 sudo[202075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dljnuheqfbgbxeysvbigeqgavvdtzctw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583596.9949493-200-193744344632833/AnsiballZ_stat.py'
Dec 01 10:06:37 compute-2 sudo[202075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:37 compute-2 python3.9[202077]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 10:06:37 compute-2 sudo[202075]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:37.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:37 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a80096e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:37 compute-2 sudo[202228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrumnkfqdxzdxvcetaorcwfyhkrabwch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583597.7092426-224-28905888794992/AnsiballZ_command.py'
Dec 01 10:06:37 compute-2 sudo[202228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:38 compute-2 ceph-mon[76053]: pgmap v467: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Dec 01 10:06:38 compute-2 python3.9[202230]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 10:06:38 compute-2 sudo[202228]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:38 compute-2 podman[202256]: 2025-12-01 10:06:38.397303121 +0000 UTC m=+0.056660395 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 01 10:06:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:38 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff380003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:38 compute-2 sudo[202400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ioolaqtnorsexsozcrvtfqfwghwhkgzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583598.4828885-248-213397658526625/AnsiballZ_stat.py'
Dec 01 10:06:38 compute-2 sudo[202400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:38 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3780016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:38 compute-2 python3.9[202402]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:06:38 compute-2 sudo[202400]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:39 compute-2 ceph-mon[76053]: pgmap v468: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Dec 01 10:06:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:39.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:39 compute-2 sudo[202525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyzkzhyedwabtqvzgkogecagfnorbtek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583598.4828885-248-213397658526625/AnsiballZ_copy.py'
Dec 01 10:06:39 compute-2 sudo[202525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:39.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:39 compute-2 python3.9[202527]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583598.4828885-248-213397658526625/.source.iscsi _original_basename=.vu6x3ct6 follow=False checksum=4f1a924d28774906f3bfff690537c50ef0aff53c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:06:39 compute-2 sudo[202525]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:39 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:40 compute-2 sudo[202600]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:06:40 compute-2 sudo[202600]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:06:40 compute-2 sudo[202600]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:06:40 compute-2 sudo[202702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcjauykjtlmpacazjdmlvedgknyuyjnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583599.9146993-293-140516095781469/AnsiballZ_file.py'
Dec 01 10:06:40 compute-2 sudo[202702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:40 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a80096e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:40 compute-2 python3.9[202704]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:06:40 compute-2 sudo[202702]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:40 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff380003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:41 compute-2 sudo[202855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyjygqstfijxbubepwkuzdpradhwepga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583600.8389404-317-44068529435302/AnsiballZ_lineinfile.py'
Dec 01 10:06:41 compute-2 ceph-mon[76053]: pgmap v469: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Dec 01 10:06:41 compute-2 sudo[202855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:06:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:41.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:06:41 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:06:41 compute-2 python3.9[202857]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:06:41 compute-2 sudo[202855]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:41.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:41 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3780016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:42 compute-2 sudo[203008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqfovxqgrlqfbolsbxrohphavlqrpmbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583601.776913-344-2158151402497/AnsiballZ_systemd_service.py'
Dec 01 10:06:42 compute-2 sudo[203008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:42 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:42 compute-2 python3.9[203010]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 10:06:42 compute-2 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Dec 01 10:06:42 compute-2 sudo[203008]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:42 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a80096e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:06:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:43.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:06:43 compute-2 sudo[203166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsrjywsqwxhptqkwwgjjmouvyblbniub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583603.090407-368-62766669329571/AnsiballZ_systemd_service.py'
Dec 01 10:06:43 compute-2 sudo[203166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:43.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:43 compute-2 python3.9[203168]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 10:06:43 compute-2 systemd[1]: Reloading.
Dec 01 10:06:43 compute-2 systemd-rc-local-generator[203199]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:06:43 compute-2 systemd-sysv-generator[203202]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:06:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:43 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff380003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:44 compute-2 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec 01 10:06:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:44 compute-2 ceph-mon[76053]: pgmap v470: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:06:44 compute-2 systemd[1]: Starting Open-iSCSI...
Dec 01 10:06:44 compute-2 kernel: Loading iSCSI transport class v2.0-870.
Dec 01 10:06:44 compute-2 systemd[1]: Started Open-iSCSI.
Dec 01 10:06:44 compute-2 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Dec 01 10:06:44 compute-2 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Dec 01 10:06:44 compute-2 sudo[203166]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:44 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3780016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:44 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:45 compute-2 sudo[203369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kajcwdgbrdpjdsjefvoxycolwewuwffs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583604.8359647-401-91071778664692/AnsiballZ_service_facts.py'
Dec 01 10:06:45 compute-2 sudo[203369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:45 compute-2 ceph-mon[76053]: pgmap v471: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:06:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:45.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:45 compute-2 python3.9[203371]: ansible-ansible.builtin.service_facts Invoked
Dec 01 10:06:45 compute-2 network[203390]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 01 10:06:45 compute-2 network[203391]: 'network-scripts' will be removed from distribution in near future.
Dec 01 10:06:45 compute-2 network[203392]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 01 10:06:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:06:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:45.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:06:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:45 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff380003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:46 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:06:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:46 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:46 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff378002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:47.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:06:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:47.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:06:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:47 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:47 compute-2 ceph-mon[76053]: pgmap v472: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 01 10:06:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:48 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff380003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:48 compute-2 sudo[203369]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:48 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:06:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:49.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:06:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:06:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:49.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:06:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:49 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff378002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:50 compute-2 ceph-mon[76053]: pgmap v473: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:06:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:50 compute-2 podman[203541]: 2025-12-01 10:06:50.433031949 +0000 UTC m=+0.091359166 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125)
Dec 01 10:06:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:50 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:50 compute-2 sudo[203693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkbxxcylqttfuchomptalycwfukvpeav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583610.5055943-431-117764861666342/AnsiballZ_file.py'
Dec 01 10:06:50 compute-2 sudo[203693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:50 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:51 compute-2 python3.9[203695]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 01 10:06:51 compute-2 sudo[203693]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:06:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:51.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:06:51 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:06:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:51.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:51 compute-2 sudo[203847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aoctajhmlezpcpnhjckdxgirhyjsiroj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583611.2483685-454-207628396838173/AnsiballZ_modprobe.py'
Dec 01 10:06:51 compute-2 sudo[203847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:51 compute-2 python3.9[203849]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Dec 01 10:06:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:51 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:51 compute-2 sudo[203847]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:52 compute-2 ceph-mon[76053]: pgmap v474: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 10:06:52 compute-2 sudo[204003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfwcsfbogncrevyhvyevrgdxfgyijlkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583612.12913-479-49394970071849/AnsiballZ_stat.py'
Dec 01 10:06:52 compute-2 sudo[204003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:52 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff378002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:52 compute-2 python3.9[204005]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:06:52 compute-2 sudo[204003]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:52 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:52 compute-2 sudo[204126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxpwczqnlotouaezmdpotegwigvuzmgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583612.12913-479-49394970071849/AnsiballZ_copy.py'
Dec 01 10:06:52 compute-2 sudo[204126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:53 compute-2 ceph-mon[76053]: pgmap v475: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:06:53 compute-2 python3.9[204128]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583612.12913-479-49394970071849/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:06:53 compute-2 sudo[204126]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:53.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:06:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:53.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:06:53 compute-2 sudo[204280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqgnbupdkwdylkbhctgysbiirfvqjrpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583613.5642233-527-46628943585744/AnsiballZ_lineinfile.py'
Dec 01 10:06:53 compute-2 sudo[204280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:53 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff380003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:54 compute-2 python3.9[204282]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:06:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:54 compute-2 sudo[204280]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:54 compute-2 sudo[204283]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:06:54 compute-2 sudo[204283]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:06:54 compute-2 sudo[204283]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:54 compute-2 sudo[204332]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:06:54 compute-2 sudo[204332]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:06:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:54 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:54 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:06:54 compute-2 sudo[204332]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:54 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff378003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:54 compute-2 sudo[204513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsngadyufrigvqdfdbsofrimjouxvrkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583614.3232381-551-160917134098126/AnsiballZ_systemd.py'
Dec 01 10:06:54 compute-2 sudo[204513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:55 compute-2 python3.9[204515]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 10:06:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:55.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:55 compute-2 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 01 10:06:55 compute-2 systemd[1]: Stopped Load Kernel Modules.
Dec 01 10:06:55 compute-2 systemd[1]: Stopping Load Kernel Modules...
Dec 01 10:06:55 compute-2 systemd[1]: Starting Load Kernel Modules...
Dec 01 10:06:55 compute-2 systemd[1]: Finished Load Kernel Modules.
Dec 01 10:06:55 compute-2 sudo[204513]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:06:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:55.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:06:55 compute-2 ceph-mon[76053]: pgmap v476: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:06:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:06:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:06:55 compute-2 ceph-mon[76053]: pgmap v477: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 305 B/s rd, 0 op/s
Dec 01 10:06:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:06:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:06:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:06:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:06:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:06:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:55 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:55 compute-2 sudo[204671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnexztanrlvofpkwusivitrzuehipoyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583615.6446128-575-213456987586901/AnsiballZ_file.py'
Dec 01 10:06:55 compute-2 sudo[204671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:56 compute-2 python3.9[204673]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:06:56 compute-2 sudo[204671]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:56 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:06:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:56 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff380003c70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:56 compute-2 sudo[204823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sveofswbfolezidpuvwgwtnolcpcaxxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583616.4450057-602-258821952132412/AnsiballZ_stat.py'
Dec 01 10:06:56 compute-2 sudo[204823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:56 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:56 compute-2 python3.9[204825]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 10:06:56 compute-2 sudo[204823]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:57.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:57.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:57 compute-2 sudo[204977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjlcxqojqivsofplpwngxubtpthfednv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583617.2493603-629-96516669008946/AnsiballZ_stat.py'
Dec 01 10:06:57 compute-2 sudo[204977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:57 compute-2 python3.9[204979]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 10:06:57 compute-2 sudo[204977]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:57 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:58 compute-2 ceph-mon[76053]: pgmap v478: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 305 B/s rd, 0 op/s
Dec 01 10:06:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:58 compute-2 sudo[205129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubsebqxpqgmmcbknmglcvdfryqtzesmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583617.940362-653-44185203570061/AnsiballZ_stat.py'
Dec 01 10:06:58 compute-2 sudo[205129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:58 compute-2 python3.9[205131]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:06:58 compute-2 sudo[205129]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:58 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:58 compute-2 sudo[205252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pijuymlhlhtlptybeonbysigwqudffvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583617.940362-653-44185203570061/AnsiballZ_copy.py'
Dec 01 10:06:58 compute-2 sudo[205252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:58 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff380003c90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:06:58 compute-2 python3.9[205254]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583617.940362-653-44185203570061/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:06:59 compute-2 sudo[205252]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:06:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:59.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:59 compute-2 ceph-mon[76053]: pgmap v479: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 305 B/s rd, 0 op/s
Dec 01 10:06:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:06:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:06:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:59.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:06:59 compute-2 sudo[205406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhurjgtlgztjjspxitrysluhtptdlfrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583619.24896-698-21473038109694/AnsiballZ_command.py'
Dec 01 10:06:59 compute-2 sudo[205406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:06:59 compute-2 python3.9[205408]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 10:06:59 compute-2 sudo[205406]: pam_unix(sudo:session): session closed for user root
Dec 01 10:06:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:59 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff380003c90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:00 compute-2 sudo[205559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rszdmimugyqnjtzcguhzccldmmufjlyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583619.9795914-722-27251480178735/AnsiballZ_lineinfile.py'
Dec 01 10:07:00 compute-2 sudo[205559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:00 compute-2 python3.9[205561]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:07:00 compute-2 sudo[205559]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:00 compute-2 sudo[205562]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:07:00 compute-2 sudo[205562]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:07:00 compute-2 sudo[205562]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:00 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:00 compute-2 sudo[205612]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:07:00 compute-2 sudo[205612]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:07:00 compute-2 sudo[205612]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:00 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:01 compute-2 sudo[205762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrabaprsljdfizwklmyallpurlfysaqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583620.6860483-746-16988096758381/AnsiballZ_replace.py'
Dec 01 10:07:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:01 compute-2 sudo[205762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:01 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:07:01 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:07:01 compute-2 ceph-mon[76053]: pgmap v480: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 305 B/s rd, 0 op/s
Dec 01 10:07:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:01.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:01 compute-2 python3.9[205764]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:07:01 compute-2 sudo[205762]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:01 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:07:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:07:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:01.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:07:01 compute-2 sudo[205916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-himralpsjdrvvocciguqswosivcvgegc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583621.5225394-770-210693092380124/AnsiballZ_replace.py'
Dec 01 10:07:01 compute-2 sudo[205916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:01 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff370000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:01 compute-2 python3.9[205918]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:07:02 compute-2 sudo[205916]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:02 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff378003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:02 compute-2 sudo[206068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swoldzraypiidjwgeauhgmoupkvvdwvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583622.3810966-797-219887660231569/AnsiballZ_lineinfile.py'
Dec 01 10:07:02 compute-2 sudo[206068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:02 compute-2 python3.9[206070]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:07:02 compute-2 sudo[206068]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:02 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:03.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:03 compute-2 sudo[206221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtgylwrnbujaulwpzrcxfypdcmttsdrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583623.0198293-797-250841974458775/AnsiballZ_lineinfile.py'
Dec 01 10:07:03 compute-2 sudo[206221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:03 compute-2 python3.9[206223]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:07:03 compute-2 sudo[206221]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:03.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:03 compute-2 sudo[206374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajqcbttltdekdkuxeonswjvfaelyvvut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583623.6338089-797-209205591622903/AnsiballZ_lineinfile.py'
Dec 01 10:07:03 compute-2 sudo[206374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:04 compute-2 ceph-mon[76053]: pgmap v481: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 305 B/s rd, 0 op/s
Dec 01 10:07:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:04 compute-2 python3.9[206376]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:07:04 compute-2 sudo[206374]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:04 compute-2 sudo[206526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbolspwcbvawztxijtkiipiofpbhxgrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583624.2455509-797-162977314462726/AnsiballZ_lineinfile.py'
Dec 01 10:07:04 compute-2 sudo[206526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:04 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:07:04.691 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:07:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:07:04.694 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:07:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:07:04.694 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:07:04 compute-2 python3.9[206528]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:07:04 compute-2 sudo[206526]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:04 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff378003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:05.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:05.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:05 compute-2 sudo[206680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncnuenmvizgmziejvolnfdrbpsckfrst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583625.4122279-884-106245984583079/AnsiballZ_stat.py'
Dec 01 10:07:05 compute-2 sudo[206680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:05 compute-2 python3.9[206682]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 10:07:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:05 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:05 compute-2 sudo[206680]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:06 compute-2 ceph-mon[76053]: pgmap v482: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 305 B/s rd, 0 op/s
Dec 01 10:07:06 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:07:06 compute-2 sudo[206834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svbcgxnsxfxuqyzcqqevwqagwcdwvuji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583626.197868-908-152045416947740/AnsiballZ_file.py'
Dec 01 10:07:06 compute-2 sudo[206834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:06 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:06 compute-2 python3.9[206836]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:07:06 compute-2 sudo[206834]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:06 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:07 compute-2 ceph-mon[76053]: pgmap v483: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 01 10:07:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:07.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:07 compute-2 sudo[206987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jampkfirpqdlvdppwuuoybkobqpkzeqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583627.0391197-935-182088599313506/AnsiballZ_file.py'
Dec 01 10:07:07 compute-2 sudo[206987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:07 compute-2 python3.9[206990]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:07:07 compute-2 sudo[206987]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:07.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:07 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff378003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:08 compute-2 sudo[207140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cimglvavicsdyjkfvmwjunjfrddvkdry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583627.8318813-958-264890273114380/AnsiballZ_stat.py'
Dec 01 10:07:08 compute-2 sudo[207140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:08 compute-2 python3.9[207142]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:07:08 compute-2 sudo[207140]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:08 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:08 compute-2 sudo[207228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvoykiscvqxyildiqmwvyjohwkdgoetw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583627.8318813-958-264890273114380/AnsiballZ_file.py'
Dec 01 10:07:08 compute-2 sudo[207228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:08 compute-2 podman[207192]: 2025-12-01 10:07:08.61745773 +0000 UTC m=+0.087501668 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 01 10:07:08 compute-2 python3.9[207239]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:07:08 compute-2 sudo[207228]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:08 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:09 compute-2 sudo[207392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qemciaojhhgogackgoaxvqtegtjrtmut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583628.9217076-958-51833665881686/AnsiballZ_stat.py'
Dec 01 10:07:09 compute-2 sudo[207392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:07:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:09.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:07:09 compute-2 python3.9[207394]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:07:09 compute-2 sudo[207392]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:07:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:09.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:07:09 compute-2 sudo[207471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujfadvfmyawxearffoubtplrhhkjbjyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583628.9217076-958-51833665881686/AnsiballZ_file.py'
Dec 01 10:07:09 compute-2 sudo[207471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:09 compute-2 python3.9[207473]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:07:09 compute-2 sudo[207471]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:09 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:10 compute-2 ceph-mon[76053]: pgmap v484: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:07:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:07:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:10 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff378003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:10 compute-2 sudo[207623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rapnfrjsatsrdgrulnvyrqkvgofwreir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583630.3373554-1028-214423340308040/AnsiballZ_file.py'
Dec 01 10:07:10 compute-2 sudo[207623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:10 compute-2 python3.9[207625]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:07:10 compute-2 sudo[207623]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:10 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:11 compute-2 ceph-mon[76053]: pgmap v485: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 10:07:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:07:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:11.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:07:11 compute-2 sudo[207776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byrhjhkcycmnwguuhuigpcoxrcqagicr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583631.0428522-1052-127712071046060/AnsiballZ_stat.py'
Dec 01 10:07:11 compute-2 sudo[207776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:11 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:07:11 compute-2 python3.9[207779]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:07:11 compute-2 sudo[207776]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:11.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:11 compute-2 sudo[207855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyhdiarndcedtsoeigmsruomslptnpaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583631.0428522-1052-127712071046060/AnsiballZ_file.py'
Dec 01 10:07:11 compute-2 sudo[207855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:11 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:11 compute-2 python3.9[207857]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:07:11 compute-2 sudo[207855]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:12 compute-2 sudo[208007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrhlqgaehzcxzqzkdfyyqcaovtunqqud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583632.2154894-1088-147040857429103/AnsiballZ_stat.py'
Dec 01 10:07:12 compute-2 sudo[208007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:12 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:12 compute-2 python3.9[208009]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:07:12 compute-2 sudo[208007]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:12 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:12 compute-2 sudo[208087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uonbyrpcflgsyudklloxkonmywegsjzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583632.2154894-1088-147040857429103/AnsiballZ_file.py'
Dec 01 10:07:12 compute-2 sudo[208087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:13 compute-2 python3.9[208089]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:07:13 compute-2 sudo[208087]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:07:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:13.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:07:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:13.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:13 compute-2 sudo[208241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlofaskxhydgjnzorscvzwwnqhwvmznu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583633.4761322-1124-6397547987645/AnsiballZ_systemd.py'
Dec 01 10:07:13 compute-2 sudo[208241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:13 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:14 compute-2 ceph-mon[76053]: pgmap v486: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:07:14 compute-2 python3.9[208243]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 10:07:14 compute-2 systemd[1]: Reloading.
Dec 01 10:07:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:14 compute-2 systemd-sysv-generator[208275]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:07:14 compute-2 systemd-rc-local-generator[208270]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:07:14 compute-2 sudo[208241]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:14 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.852987) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583634853142, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 1157, "num_deletes": 256, "total_data_size": 2780945, "memory_usage": 2824128, "flush_reason": "Manual Compaction"}
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583634865055, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 1786618, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18220, "largest_seqno": 19371, "table_properties": {"data_size": 1781656, "index_size": 2486, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 10397, "raw_average_key_size": 18, "raw_value_size": 1771576, "raw_average_value_size": 3157, "num_data_blocks": 111, "num_entries": 561, "num_filter_entries": 561, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764583543, "oldest_key_time": 1764583543, "file_creation_time": 1764583634, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 12502 microseconds, and 5610 cpu microseconds.
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.865513) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 1786618 bytes OK
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.865681) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.867364) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.867382) EVENT_LOG_v1 {"time_micros": 1764583634867378, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.867407) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 2775350, prev total WAL file size 2775350, number of live WAL files 2.
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.868689) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323533' seq:0, type:0; will stop at (end)
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(1744KB)], [33(11MB)]
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583634868784, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 13878104, "oldest_snapshot_seqno": -1}
Dec 01 10:07:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:14 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 4906 keys, 13411130 bytes, temperature: kUnknown
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583634958087, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 13411130, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13377141, "index_size": 20631, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12293, "raw_key_size": 125107, "raw_average_key_size": 25, "raw_value_size": 13286715, "raw_average_value_size": 2708, "num_data_blocks": 846, "num_entries": 4906, "num_filter_entries": 4906, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764583634, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.958960) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 13411130 bytes
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.960554) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 154.2 rd, 149.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 11.5 +0.0 blob) out(12.8 +0.0 blob), read-write-amplify(15.3) write-amplify(7.5) OK, records in: 5432, records dropped: 526 output_compression: NoCompression
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.960579) EVENT_LOG_v1 {"time_micros": 1764583634960568, "job": 18, "event": "compaction_finished", "compaction_time_micros": 90000, "compaction_time_cpu_micros": 39732, "output_level": 6, "num_output_files": 1, "total_output_size": 13411130, "num_input_records": 5432, "num_output_records": 4906, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583634961049, "job": 18, "event": "table_file_deletion", "file_number": 35}
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583634963468, "job": 18, "event": "table_file_deletion", "file_number": 33}
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.868574) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.963539) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.963545) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.963547) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.963549) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:07:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.963550) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:07:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:15 compute-2 sudo[208431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rklvxxugogpiwruedbxuwvqhwcrgsgby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583634.8637037-1147-82374766614846/AnsiballZ_stat.py'
Dec 01 10:07:15 compute-2 sudo[208431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:07:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:15.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:07:15 compute-2 python3.9[208433]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:07:15 compute-2 sudo[208431]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:15.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:15 compute-2 sudo[208510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjozyihmwyfvksbcuckfkkjafcneaojb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583634.8637037-1147-82374766614846/AnsiballZ_file.py'
Dec 01 10:07:15 compute-2 sudo[208510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:15 compute-2 ceph-mon[76053]: pgmap v487: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:07:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:15 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff370002c90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:15 compute-2 python3.9[208512]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:07:15 compute-2 sudo[208510]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:16 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:07:16 compute-2 sudo[208662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzxkiwmqkiacqiyfazbzpzaumdkfpzzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583636.1562986-1184-280376253593421/AnsiballZ_stat.py'
Dec 01 10:07:16 compute-2 sudo[208662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:16 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3900016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:16 compute-2 python3.9[208664]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:07:16 compute-2 sudo[208662]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:16 compute-2 sudo[208740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebsmdljhspfaknkrxgpmyhcnwvmxryfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583636.1562986-1184-280376253593421/AnsiballZ_file.py'
Dec 01 10:07:16 compute-2 sudo[208740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:16 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:17 compute-2 python3.9[208742]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:07:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:17 compute-2 sudo[208740]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:17.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:17.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:17 compute-2 sudo[208894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zljgqtgvlidbiwxkccoujdhhcugxkqmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583637.464074-1220-101040298989134/AnsiballZ_systemd.py'
Dec 01 10:07:17 compute-2 sudo[208894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:17 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:18 compute-2 ceph-mon[76053]: pgmap v488: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 01 10:07:18 compute-2 python3.9[208896]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 10:07:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:18 compute-2 systemd[1]: Reloading.
Dec 01 10:07:18 compute-2 systemd-rc-local-generator[208923]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:07:18 compute-2 systemd-sysv-generator[208926]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:07:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:18 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700035b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:18 compute-2 systemd[1]: Starting Create netns directory...
Dec 01 10:07:18 compute-2 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 01 10:07:18 compute-2 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 01 10:07:18 compute-2 systemd[1]: Finished Create netns directory.
Dec 01 10:07:18 compute-2 sudo[208894]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:18 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3900016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:07:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:19.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:07:19 compute-2 sudo[209088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azbyfrzqmsympkrfsgdzfdubcsmvtcmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583639.1190279-1250-234169260047788/AnsiballZ_file.py'
Dec 01 10:07:19 compute-2 sudo[209088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:19.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:19 compute-2 python3.9[209090]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:07:19 compute-2 sudo[209088]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:19 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:20 compute-2 ceph-mon[76053]: pgmap v489: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:07:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:20 compute-2 sudo[209253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvmitpvozjfftfivlgmogvudgdnicrmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583640.1547403-1275-244077827002792/AnsiballZ_stat.py'
Dec 01 10:07:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:20 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:20 compute-2 sudo[209253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:20 compute-2 podman[209214]: 2025-12-01 10:07:20.614858979 +0000 UTC m=+0.134769829 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible)
Dec 01 10:07:20 compute-2 sudo[209266]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:07:20 compute-2 sudo[209266]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:07:20 compute-2 sudo[209266]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:20 compute-2 python3.9[209261]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:07:20 compute-2 sudo[209253]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:20 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:21 compute-2 sudo[209414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apekxtjktvizehzhtkbbumcbwvvvijwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583640.1547403-1275-244077827002792/AnsiballZ_copy.py'
Dec 01 10:07:21 compute-2 sudo[209414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:21 compute-2 ceph-mon[76053]: pgmap v490: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 10:07:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.004000096s ======
Dec 01 10:07:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:21.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000096s
Dec 01 10:07:21 compute-2 python3.9[209416]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583640.1547403-1275-244077827002792/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:07:21 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:07:21 compute-2 sudo[209414]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:07:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:21.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:07:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:21 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3900016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:22 compute-2 sudo[209568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmjrrixelhtinonpaoflfyumjyasmxel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583641.9496098-1325-123698439060567/AnsiballZ_file.py'
Dec 01 10:07:22 compute-2 sudo[209568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:22 compute-2 python3.9[209570]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:07:22 compute-2 sudo[209568]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:22 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3900016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:22 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700035b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:22 compute-2 sudo[209720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vekqrgudotsviitkcbccgqnbrmvhqqur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583642.7223694-1348-124806485918067/AnsiballZ_stat.py'
Dec 01 10:07:22 compute-2 sudo[209720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:23 compute-2 python3.9[209722]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:07:23 compute-2 sudo[209720]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:23.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:23 compute-2 sudo[209845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yklpzqblukocjjewkrlfixsmvnnbwlgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583642.7223694-1348-124806485918067/AnsiballZ_copy.py'
Dec 01 10:07:23 compute-2 sudo[209845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:23.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:23 compute-2 ceph-mon[76053]: pgmap v491: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:07:23 compute-2 python3.9[209847]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583642.7223694-1348-124806485918067/.source.json _original_basename=.vx5mixsr follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:07:23 compute-2 sudo[209845]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100723 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:07:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:23 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:24 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:24 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:07:24 compute-2 sudo[209997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldtursbppdugegjrepwgazwjopkgoppm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583644.60792-1394-29244927288882/AnsiballZ_file.py'
Dec 01 10:07:24 compute-2 sudo[209997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:24 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:25 compute-2 python3.9[209999]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:07:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:25 compute-2 sudo[209997]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:25.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:25.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:25 compute-2 ceph-mon[76053]: pgmap v492: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:07:25 compute-2 sudo[210151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okmcoyxcfjsdnrmdyanqgqzlnncunqwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583645.515094-1417-194038246169863/AnsiballZ_stat.py'
Dec 01 10:07:25 compute-2 sudo[210151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:25 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700035b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:25 compute-2 sudo[210151]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:26 compute-2 sudo[210274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldopdjnxajwjlvxppjyhorbeomjmczzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583645.515094-1417-194038246169863/AnsiballZ_copy.py'
Dec 01 10:07:26 compute-2 sudo[210274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:26 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:07:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:26 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:26 compute-2 sudo[210274]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:26 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:07:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:27.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:07:27 compute-2 sudo[210428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sifanxkxymwfnrycigjtxesvxgnnzznt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583647.0869255-1469-9505653644655/AnsiballZ_container_config_data.py'
Dec 01 10:07:27 compute-2 sudo[210428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:27.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:27 compute-2 python3.9[210430]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Dec 01 10:07:27 compute-2 sudo[210428]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:27 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:28 compute-2 ceph-mon[76053]: pgmap v493: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 10:07:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:28 compute-2 sudo[210580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvgylhbutgyzbsfpibrnkgnlqaimygfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583648.090258-1496-173722977190432/AnsiballZ_container_config_hash.py'
Dec 01 10:07:28 compute-2 sudo[210580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:28 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:28 compute-2 python3.9[210582]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 01 10:07:28 compute-2 sudo[210580]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:28 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:07:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:29.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:07:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:29.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:29 compute-2 sudo[210734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdsdjhnvyfxacaroeoxjvjtgtwshkwed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583649.1660457-1522-29933670031691/AnsiballZ_podman_container_info.py'
Dec 01 10:07:29 compute-2 sudo[210734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:29 compute-2 python3.9[210736]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 01 10:07:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:29 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:30 compute-2 sudo[210734]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:30 compute-2 ceph-mon[76053]: pgmap v494: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:07:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:30 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:30 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:31.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:31 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:07:31 compute-2 systemd[1]: virtnodedevd.service: Deactivated successfully.
Dec 01 10:07:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:31.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:31 compute-2 sudo[210916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttintihvgkrraepgqolwvcqdbfxmczbc ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764583651.3702672-1561-48880643730253/AnsiballZ_edpm_container_manage.py'
Dec 01 10:07:31 compute-2 sudo[210916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:31 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c0016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:32 compute-2 ceph-mon[76053]: pgmap v495: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Dec 01 10:07:32 compute-2 python3[210918]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 01 10:07:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:32 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:32 compute-2 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec 01 10:07:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:32 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:33 compute-2 ceph-mon[76053]: pgmap v496: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Dec 01 10:07:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:33.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:33.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:33 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:07:33 compute-2 podman[210931]: 2025-12-01 10:07:33.825569737 +0000 UTC m=+1.643480597 image pull 9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 01 10:07:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:33 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:34 compute-2 podman[210989]: 2025-12-01 10:07:33.94840904 +0000 UTC m=+0.021564955 image pull 9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 01 10:07:34 compute-2 podman[210989]: 2025-12-01 10:07:34.0856484 +0000 UTC m=+0.158804295 container create 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 01 10:07:34 compute-2 python3[210918]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 01 10:07:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:34 compute-2 sudo[210916]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:34 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c0016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:34 compute-2 sudo[211174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frqtjfkbcfppuzpmwiaswsmpcnbkljux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583654.5680668-1586-53325622993047/AnsiballZ_stat.py'
Dec 01 10:07:34 compute-2 sudo[211174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:34 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c0016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:35 compute-2 python3.9[211176]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 10:07:35 compute-2 sudo[211174]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:35.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:07:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:35.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:07:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:35 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:35 compute-2 sudo[211330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blmmotstwpvcnrofzcxuyclpnzmxcgap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583655.6753087-1612-12523261226441/AnsiballZ_file.py'
Dec 01 10:07:35 compute-2 sudo[211330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:36 compute-2 ceph-mon[76053]: pgmap v497: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Dec 01 10:07:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:36 compute-2 python3.9[211332]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:07:36 compute-2 sudo[211330]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:36 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:07:36 compute-2 sudo[211406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czdoxcyqwwtqfohamsiuelvkmfasvdcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583655.6753087-1612-12523261226441/AnsiballZ_stat.py'
Dec 01 10:07:36 compute-2 sudo[211406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:36 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:36 compute-2 python3.9[211408]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 10:07:36 compute-2 sudo[211406]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100736 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:07:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:36 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:07:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:36 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:07:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:36 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c0016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:37 compute-2 sudo[211557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crxnsunhixqfdckzwlksmdlyktyvszju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583656.6539123-1612-75142857614708/AnsiballZ_copy.py'
Dec 01 10:07:37 compute-2 sudo[211557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:37 compute-2 python3.9[211559]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764583656.6539123-1612-75142857614708/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:07:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:37.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:37 compute-2 sudo[211557]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:07:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:37.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:07:37 compute-2 sudo[211635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojwvathzzvpbkgxtgnvlhakahgnjerbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583656.6539123-1612-75142857614708/AnsiballZ_systemd.py'
Dec 01 10:07:37 compute-2 sudo[211635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:37 compute-2 python3.9[211637]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 01 10:07:37 compute-2 systemd[1]: Reloading.
Dec 01 10:07:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:37 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c0016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:38 compute-2 systemd-rc-local-generator[211665]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:07:38 compute-2 systemd-sysv-generator[211668]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:07:38 compute-2 ceph-mon[76053]: pgmap v498: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 426 B/s wr, 1 op/s
Dec 01 10:07:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:38 compute-2 sudo[211635]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:38 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c0016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:38 compute-2 sudo[211746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unbpdaokvmcgyfaruzziucgvcbdnroki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583656.6539123-1612-75142857614708/AnsiballZ_systemd.py'
Dec 01 10:07:38 compute-2 sudo[211746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:38 compute-2 python3.9[211748]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 10:07:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:38 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:38 compute-2 systemd[1]: Reloading.
Dec 01 10:07:39 compute-2 podman[211750]: 2025-12-01 10:07:38.999938527 +0000 UTC m=+0.066125274 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 01 10:07:39 compute-2 systemd-rc-local-generator[211797]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:07:39 compute-2 systemd-sysv-generator[211801]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:07:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:39 compute-2 ceph-mon[76053]: pgmap v499: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 852 B/s rd, 426 B/s wr, 1 op/s
Dec 01 10:07:39 compute-2 systemd[1]: Starting multipathd container...
Dec 01 10:07:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:39.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:39 compute-2 systemd[1]: Started libcrun container.
Dec 01 10:07:39 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9a2bbcc8731c616c4db60abda2895d48cb3adcffde6004651ea53b3144337af/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 01 10:07:39 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9a2bbcc8731c616c4db60abda2895d48cb3adcffde6004651ea53b3144337af/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 01 10:07:39 compute-2 systemd[1]: Started /usr/bin/podman healthcheck run 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db.
Dec 01 10:07:39 compute-2 podman[211809]: 2025-12-01 10:07:39.457227667 +0000 UTC m=+0.117412163 container init 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 01 10:07:39 compute-2 multipathd[211824]: + sudo -E kolla_set_configs
Dec 01 10:07:39 compute-2 podman[211809]: 2025-12-01 10:07:39.480955488 +0000 UTC m=+0.141139984 container start 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 10:07:39 compute-2 sudo[211830]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 01 10:07:39 compute-2 sudo[211830]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 01 10:07:39 compute-2 sudo[211830]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 01 10:07:39 compute-2 podman[211809]: multipathd
Dec 01 10:07:39 compute-2 systemd[1]: Started multipathd container.
Dec 01 10:07:39 compute-2 multipathd[211824]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 01 10:07:39 compute-2 multipathd[211824]: INFO:__main__:Validating config file
Dec 01 10:07:39 compute-2 multipathd[211824]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 01 10:07:39 compute-2 multipathd[211824]: INFO:__main__:Writing out command to execute
Dec 01 10:07:39 compute-2 sudo[211746]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:39 compute-2 sudo[211830]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:39 compute-2 multipathd[211824]: ++ cat /run_command
Dec 01 10:07:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:39 compute-2 podman[211831]: 2025-12-01 10:07:39.557869048 +0000 UTC m=+0.065929688 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 01 10:07:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.002000051s ======
Dec 01 10:07:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:39.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000051s
Dec 01 10:07:39 compute-2 multipathd[211824]: + CMD='/usr/sbin/multipathd -d'
Dec 01 10:07:39 compute-2 multipathd[211824]: + ARGS=
Dec 01 10:07:39 compute-2 multipathd[211824]: + sudo kolla_copy_cacerts
Dec 01 10:07:39 compute-2 systemd[1]: 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db-27e0e9e6620a1a7a.service: Main process exited, code=exited, status=1/FAILURE
Dec 01 10:07:39 compute-2 systemd[1]: 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db-27e0e9e6620a1a7a.service: Failed with result 'exit-code'.
Dec 01 10:07:39 compute-2 sudo[211852]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 01 10:07:39 compute-2 sudo[211852]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 01 10:07:39 compute-2 sudo[211852]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 01 10:07:39 compute-2 sudo[211852]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:39 compute-2 multipathd[211824]: + [[ ! -n '' ]]
Dec 01 10:07:39 compute-2 multipathd[211824]: + . kolla_extend_start
Dec 01 10:07:39 compute-2 multipathd[211824]: Running command: '/usr/sbin/multipathd -d'
Dec 01 10:07:39 compute-2 multipathd[211824]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 01 10:07:39 compute-2 multipathd[211824]: + umask 0022
Dec 01 10:07:39 compute-2 multipathd[211824]: + exec /usr/sbin/multipathd -d
Dec 01 10:07:39 compute-2 multipathd[211824]: 3521.483360 | --------start up--------
Dec 01 10:07:39 compute-2 multipathd[211824]: 3521.483380 | read /etc/multipath.conf
Dec 01 10:07:39 compute-2 multipathd[211824]: 3521.489693 | path checkers start up
Dec 01 10:07:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:39 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700035b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:07:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:40 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700035b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:40 compute-2 sudo[212012]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:07:40 compute-2 sudo[212012]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:07:40 compute-2 sudo[212012]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:40 compute-2 python3.9[212011]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 10:07:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:40 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c0036e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:41 compute-2 ceph-mon[76053]: pgmap v500: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Dec 01 10:07:41 compute-2 sudo[212190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwdarbxuimukrartnnwbhihgeehufoos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583661.0398543-1721-79669348403115/AnsiballZ_command.py'
Dec 01 10:07:41 compute-2 sudo[212190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:07:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:41.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:07:41 compute-2 python3.9[212192]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 10:07:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:41.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:41 compute-2 sudo[212190]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:41 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:07:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:41 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:07:41 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:07:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:41 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:42 compute-2 sudo[212355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvwlkoglgvvessdqzyoqxugfenffgsrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583661.8651116-1744-211286756860424/AnsiballZ_systemd.py'
Dec 01 10:07:42 compute-2 sudo[212355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:42 compute-2 python3.9[212357]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 10:07:42 compute-2 systemd[1]: Stopping multipathd container...
Dec 01 10:07:42 compute-2 multipathd[211824]: 3524.431054 | exit (signal)
Dec 01 10:07:42 compute-2 multipathd[211824]: 3524.431138 | --------shut down-------
Dec 01 10:07:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:42 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700035b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:42 compute-2 systemd[1]: libpod-212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db.scope: Deactivated successfully.
Dec 01 10:07:42 compute-2 podman[212361]: 2025-12-01 10:07:42.576961191 +0000 UTC m=+0.066031551 container died 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd)
Dec 01 10:07:42 compute-2 systemd[1]: 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db-27e0e9e6620a1a7a.timer: Deactivated successfully.
Dec 01 10:07:42 compute-2 systemd[1]: Stopped /usr/bin/podman healthcheck run 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db.
Dec 01 10:07:42 compute-2 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db-userdata-shm.mount: Deactivated successfully.
Dec 01 10:07:42 compute-2 systemd[1]: var-lib-containers-storage-overlay-a9a2bbcc8731c616c4db60abda2895d48cb3adcffde6004651ea53b3144337af-merged.mount: Deactivated successfully.
Dec 01 10:07:42 compute-2 podman[212361]: 2025-12-01 10:07:42.820826708 +0000 UTC m=+0.309897038 container cleanup 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 01 10:07:42 compute-2 podman[212361]: multipathd
Dec 01 10:07:42 compute-2 podman[212390]: multipathd
Dec 01 10:07:42 compute-2 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Dec 01 10:07:42 compute-2 systemd[1]: Stopped multipathd container.
Dec 01 10:07:42 compute-2 systemd[1]: Starting multipathd container...
Dec 01 10:07:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:42 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003260 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:42 compute-2 systemd[1]: Started libcrun container.
Dec 01 10:07:42 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9a2bbcc8731c616c4db60abda2895d48cb3adcffde6004651ea53b3144337af/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 01 10:07:42 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9a2bbcc8731c616c4db60abda2895d48cb3adcffde6004651ea53b3144337af/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 01 10:07:43 compute-2 systemd[1]: Started /usr/bin/podman healthcheck run 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db.
Dec 01 10:07:43 compute-2 podman[212403]: 2025-12-01 10:07:43.029692584 +0000 UTC m=+0.116476709 container init 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 01 10:07:43 compute-2 multipathd[212419]: + sudo -E kolla_set_configs
Dec 01 10:07:43 compute-2 sudo[212425]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 01 10:07:43 compute-2 sudo[212425]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 01 10:07:43 compute-2 sudo[212425]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 01 10:07:43 compute-2 podman[212403]: 2025-12-01 10:07:43.064162301 +0000 UTC m=+0.150946406 container start 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 01 10:07:43 compute-2 podman[212403]: multipathd
Dec 01 10:07:43 compute-2 systemd[1]: Started multipathd container.
Dec 01 10:07:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:43 compute-2 multipathd[212419]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 01 10:07:43 compute-2 multipathd[212419]: INFO:__main__:Validating config file
Dec 01 10:07:43 compute-2 multipathd[212419]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 01 10:07:43 compute-2 multipathd[212419]: INFO:__main__:Writing out command to execute
Dec 01 10:07:43 compute-2 sudo[212355]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:43 compute-2 sudo[212425]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:43 compute-2 multipathd[212419]: ++ cat /run_command
Dec 01 10:07:43 compute-2 multipathd[212419]: + CMD='/usr/sbin/multipathd -d'
Dec 01 10:07:43 compute-2 multipathd[212419]: + ARGS=
Dec 01 10:07:43 compute-2 multipathd[212419]: + sudo kolla_copy_cacerts
Dec 01 10:07:43 compute-2 sudo[212449]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 01 10:07:43 compute-2 sudo[212449]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 01 10:07:43 compute-2 sudo[212449]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 01 10:07:43 compute-2 podman[212426]: 2025-12-01 10:07:43.140316872 +0000 UTC m=+0.063746722 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd)
Dec 01 10:07:43 compute-2 sudo[212449]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:43 compute-2 multipathd[212419]: + [[ ! -n '' ]]
Dec 01 10:07:43 compute-2 multipathd[212419]: + . kolla_extend_start
Dec 01 10:07:43 compute-2 multipathd[212419]: Running command: '/usr/sbin/multipathd -d'
Dec 01 10:07:43 compute-2 multipathd[212419]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 01 10:07:43 compute-2 multipathd[212419]: + umask 0022
Dec 01 10:07:43 compute-2 multipathd[212419]: + exec /usr/sbin/multipathd -d
Dec 01 10:07:43 compute-2 systemd[1]: 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db-4abee0c55f24cdf9.service: Main process exited, code=exited, status=1/FAILURE
Dec 01 10:07:43 compute-2 systemd[1]: 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db-4abee0c55f24cdf9.service: Failed with result 'exit-code'.
Dec 01 10:07:43 compute-2 multipathd[212419]: 3525.046215 | --------start up--------
Dec 01 10:07:43 compute-2 multipathd[212419]: 3525.046235 | read /etc/multipath.conf
Dec 01 10:07:43 compute-2 multipathd[212419]: 3525.051915 | path checkers start up
Dec 01 10:07:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:43.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:07:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:43.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:07:43 compute-2 sudo[212611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgdlttggklnnljmyooceurrabzdoscqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583663.408408-1769-109267911414507/AnsiballZ_file.py'
Dec 01 10:07:43 compute-2 sudo[212611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:43 compute-2 systemd[1]: virtsecretd.service: Deactivated successfully.
Dec 01 10:07:43 compute-2 systemd[1]: virtqemud.service: Deactivated successfully.
Dec 01 10:07:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:43 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c0036e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:43 compute-2 python3.9[212613]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:07:43 compute-2 sudo[212611]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:44 compute-2 ceph-mon[76053]: pgmap v501: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 341 B/s wr, 1 op/s
Dec 01 10:07:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:44 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:44 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 10:07:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:44 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:07:44 compute-2 sudo[212766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bczihwznrenkyfweqsvhmvzgksrbndlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583664.548797-1805-279518089205667/AnsiballZ_file.py'
Dec 01 10:07:44 compute-2 sudo[212766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:44 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff378001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:45 compute-2 python3.9[212768]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 01 10:07:45 compute-2 sudo[212766]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:07:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:45.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:07:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:07:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:45.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:07:45 compute-2 sudo[212920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cswzrksyhsrnyvdrkcihtsypulixfmfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583665.2613432-1828-132091318647946/AnsiballZ_modprobe.py'
Dec 01 10:07:45 compute-2 sudo[212920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:45 compute-2 python3.9[212922]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Dec 01 10:07:45 compute-2 kernel: Key type psk registered
Dec 01 10:07:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:45 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003260 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:45 compute-2 sudo[212920]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:46 compute-2 ceph-mon[76053]: pgmap v502: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 341 B/s wr, 1 op/s
Dec 01 10:07:46 compute-2 sudo[213083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyvjcrwyezbpswkcqgrzeshkeagnjszk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583666.2056968-1853-123705970755541/AnsiballZ_stat.py'
Dec 01 10:07:46 compute-2 sudo[213083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:46 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c0036e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:46 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:07:46 compute-2 python3.9[213085]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:07:46 compute-2 sudo[213083]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:46 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:46 compute-2 sudo[213206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrzbiuoycfgqxcdrjeegnendvdrettqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583666.2056968-1853-123705970755541/AnsiballZ_copy.py'
Dec 01 10:07:46 compute-2 sudo[213206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:47 compute-2 python3.9[213208]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583666.2056968-1853-123705970755541/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:07:47 compute-2 sudo[213206]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:07:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:47.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:07:47 compute-2 ceph-mon[76053]: pgmap v503: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 1.1 KiB/s wr, 3 op/s
Dec 01 10:07:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:47.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:47 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:07:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:47 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:07:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:47 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:07:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:47 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:07:47 compute-2 sudo[213360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwptizrdmhksguwtamszyobxjbszszjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583667.6348696-1901-182706451508309/AnsiballZ_lineinfile.py'
Dec 01 10:07:47 compute-2 sudo[213360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:47 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff378001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:48 compute-2 python3.9[213362]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:07:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:48 compute-2 sudo[213360]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:48 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:48 compute-2 sudo[213512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqpuludiaekhdzpqvajdmndllgxboxht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583668.398541-1924-219635898795951/AnsiballZ_systemd.py'
Dec 01 10:07:48 compute-2 sudo[213512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:48 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:48 compute-2 python3.9[213514]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 10:07:49 compute-2 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 01 10:07:49 compute-2 systemd[1]: Stopped Load Kernel Modules.
Dec 01 10:07:49 compute-2 systemd[1]: Stopping Load Kernel Modules...
Dec 01 10:07:49 compute-2 systemd[1]: Starting Load Kernel Modules...
Dec 01 10:07:49 compute-2 systemd[1]: Finished Load Kernel Modules.
Dec 01 10:07:49 compute-2 sudo[213512]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:07:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:49.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:07:49 compute-2 ceph-mon[76053]: pgmap v504: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 767 B/s wr, 3 op/s
Dec 01 10:07:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:49.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:49 compute-2 sudo[213670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhywismefwtywyubfynigzvvgxyllzmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583669.5755775-1949-105832339589513/AnsiballZ_dnf.py'
Dec 01 10:07:49 compute-2 sudo[213670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100749 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:07:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:49 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:50 compute-2 python3.9[213672]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 01 10:07:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:50 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff378001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:50 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 10:07:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:50 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:07:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:51.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:07:51 compute-2 podman[213676]: 2025-12-01 10:07:51.445798428 +0000 UTC m=+0.098655971 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Dec 01 10:07:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:51.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:51 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:07:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:51 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:52 compute-2 ceph-mon[76053]: pgmap v505: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.6 KiB/s rd, 1.2 KiB/s wr, 4 op/s
Dec 01 10:07:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:52 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:52 compute-2 systemd[1]: Reloading.
Dec 01 10:07:52 compute-2 systemd-rc-local-generator[213733]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:07:52 compute-2 systemd-sysv-generator[213737]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:07:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:52 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:52 compute-2 systemd[1]: Reloading.
Dec 01 10:07:53 compute-2 systemd-sysv-generator[213770]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:07:53 compute-2 systemd-rc-local-generator[213766]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:07:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:53 compute-2 ceph-mon[76053]: pgmap v506: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.1 KiB/s rd, 1.2 KiB/s wr, 4 op/s
Dec 01 10:07:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:53.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:53 compute-2 systemd-logind[795]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 01 10:07:53 compute-2 systemd-logind[795]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 01 10:07:53 compute-2 lvm[213813]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 01 10:07:53 compute-2 lvm[213813]: VG ceph_vg0 finished
Dec 01 10:07:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:53.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:53 compute-2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 01 10:07:53 compute-2 systemd[1]: Starting man-db-cache-update.service...
Dec 01 10:07:53 compute-2 systemd[1]: Reloading.
Dec 01 10:07:53 compute-2 systemd-rc-local-generator[213866]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:07:53 compute-2 systemd-sysv-generator[213869]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:07:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:53 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:54 compute-2 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 01 10:07:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:54 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff378002f20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:54 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:07:54 compute-2 sudo[213670]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:54 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:55 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 01 10:07:55 compute-2 systemd[1]: Finished man-db-cache-update.service.
Dec 01 10:07:55 compute-2 systemd[1]: man-db-cache-update.service: Consumed 1.699s CPU time.
Dec 01 10:07:55 compute-2 systemd[1]: run-r3c614ad62f594238b3a322d8c51e1139.service: Deactivated successfully.
Dec 01 10:07:55 compute-2 sudo[215154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqvpvuzxeszhpyznpvjqlbqupjvwnlka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583675.0052416-1974-155576180956485/AnsiballZ_systemd_service.py'
Dec 01 10:07:55 compute-2 sudo[215154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:55.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:55.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:55 compute-2 python3.9[215157]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 10:07:55 compute-2 systemd[1]: Stopping Open-iSCSI...
Dec 01 10:07:55 compute-2 iscsid[203209]: iscsid shutting down.
Dec 01 10:07:55 compute-2 systemd[1]: iscsid.service: Deactivated successfully.
Dec 01 10:07:55 compute-2 systemd[1]: Stopped Open-iSCSI.
Dec 01 10:07:55 compute-2 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec 01 10:07:55 compute-2 systemd[1]: Starting Open-iSCSI...
Dec 01 10:07:55 compute-2 systemd[1]: Started Open-iSCSI.
Dec 01 10:07:55 compute-2 sudo[215154]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:55 compute-2 ceph-mon[76053]: pgmap v507: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.1 KiB/s rd, 1.2 KiB/s wr, 4 op/s
Dec 01 10:07:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:55 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:56 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:56 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:07:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100756 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:07:56 compute-2 python3.9[215312]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 01 10:07:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:56 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff378002f20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:57.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:07:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:57.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:07:57 compute-2 sudo[215468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gewslmsqliexolcmlfoowqrveykiaekl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583677.363154-2025-195857147340754/AnsiballZ_file.py'
Dec 01 10:07:57 compute-2 sudo[215468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:57 compute-2 python3.9[215470]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:07:57 compute-2 sudo[215468]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:57 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:58 compute-2 ceph-mon[76053]: pgmap v508: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.7 KiB/s rd, 1.3 KiB/s wr, 5 op/s
Dec 01 10:07:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:58 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:58 compute-2 sudo[215620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zozfksfxfcknlokpavgnozyngwareslt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583678.3580294-2058-128943247709979/AnsiballZ_systemd_service.py'
Dec 01 10:07:58 compute-2 sudo[215620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:07:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:58 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:07:58 compute-2 python3.9[215622]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 01 10:07:58 compute-2 systemd[1]: Reloading.
Dec 01 10:07:59 compute-2 systemd-sysv-generator[215652]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:07:59 compute-2 systemd-rc-local-generator[215649]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:07:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:07:59 compute-2 ceph-mon[76053]: pgmap v509: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 596 B/s wr, 2 op/s
Dec 01 10:07:59 compute-2 sudo[215620]: pam_unix(sudo:session): session closed for user root
Dec 01 10:07:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:07:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:59.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:07:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:07:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:07:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:59.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:07:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:59 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:00 compute-2 python3.9[215809]: ansible-ansible.builtin.service_facts Invoked
Dec 01 10:08:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:00 compute-2 network[215827]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 01 10:08:00 compute-2 network[215828]: 'network-scripts' will be removed from distribution in near future.
Dec 01 10:08:00 compute-2 network[215829]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 01 10:08:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:00 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:00 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:01 compute-2 sudo[215835]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:08:01 compute-2 sudo[215836]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:08:01 compute-2 sudo[215835]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:08:01 compute-2 sudo[215836]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:08:01 compute-2 sudo[215836]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:01 compute-2 sudo[215835]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:01 compute-2 sudo[215886]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:08:01 compute-2 sudo[215886]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:08:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:08:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:01.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:08:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:01.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:01 compute-2 sudo[215886]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:01 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:08:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:01 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002560 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:02 compute-2 ceph-mon[76053]: pgmap v510: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 597 B/s wr, 2 op/s
Dec 01 10:08:02 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:08:02 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:08:02 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:08:02 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:08:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:02 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:02 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:03 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:08:03 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:08:03 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:08:03 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:08:03 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:08:03 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:08:03 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:08:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:08:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:03.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:08:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:03.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:04 compute-2 ceph-mon[76053]: pgmap v511: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 637 B/s rd, 182 B/s wr, 0 op/s
Dec 01 10:08:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:04 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002560 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:08:04.693 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:08:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:08:04.696 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:08:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:08:04.696 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:08:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:04 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002560 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:05 compute-2 ceph-mon[76053]: pgmap v512: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 637 B/s rd, 182 B/s wr, 0 op/s
Dec 01 10:08:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:05.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:05.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:05 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:06 compute-2 sudo[216213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycapajabeukqlyszqzoldrkyqnprvwme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583685.738737-2115-250145542063480/AnsiballZ_systemd_service.py'
Dec 01 10:08:06 compute-2 sudo[216213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:06 compute-2 python3.9[216215]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 10:08:06 compute-2 sudo[216213]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:06 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:06 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:08:06 compute-2 sudo[216366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msgbkbvcyergkrashxadinwcrtgemunx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583686.4893203-2115-77475768106743/AnsiballZ_systemd_service.py'
Dec 01 10:08:06 compute-2 sudo[216366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:06 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:07 compute-2 python3.9[216368]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 10:08:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:07 compute-2 sudo[216366]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:07 compute-2 ceph-mon[76053]: pgmap v513: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 820 B/s rd, 182 B/s wr, 1 op/s
Dec 01 10:08:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:08:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:07.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:08:07 compute-2 sudo[216521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkiskapkdewkzixtqzgmyjbkhnqfhuxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583687.2771091-2115-204101957236923/AnsiballZ_systemd_service.py'
Dec 01 10:08:07 compute-2 sudo[216521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:07.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:07 compute-2 python3.9[216523]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 10:08:07 compute-2 sudo[216521]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:07 compute-2 sudo[216525]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:08:07 compute-2 sudo[216525]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:08:07 compute-2 sudo[216525]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:07 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:08 compute-2 sudo[216699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ropksqikjvmpqtfrlpgysvzitrwrqran ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583688.075372-2115-157902066384734/AnsiballZ_systemd_service.py'
Dec 01 10:08:08 compute-2 sudo[216699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:08 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:08 compute-2 python3.9[216701]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 10:08:08 compute-2 sudo[216699]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:08 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:08:08 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:08:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:08 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:09 compute-2 sudo[216852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsoqnommfimgreachdlsmfgrwemcwesh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583688.8239863-2115-104254144436241/AnsiballZ_systemd_service.py'
Dec 01 10:08:09 compute-2 sudo[216852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:09.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:09 compute-2 podman[216857]: 2025-12-01 10:08:09.409576453 +0000 UTC m=+0.061228088 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 01 10:08:09 compute-2 python3.9[216854]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 10:08:09 compute-2 sudo[216852]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:09.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:09 compute-2 ceph-mon[76053]: pgmap v514: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 182 B/s rd, 0 op/s
Dec 01 10:08:09 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:08:09 compute-2 sudo[217027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvifsbfkapzpmdppjzhypzrokoiahyqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583689.6032114-2115-230487456340531/AnsiballZ_systemd_service.py'
Dec 01 10:08:09 compute-2 sudo[217027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100809 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:08:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:09 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:10 compute-2 python3.9[217029]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 10:08:10 compute-2 sudo[217027]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:10 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002560 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:10 compute-2 sudo[217180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmpkkbmxoyggjouafbzgdzqvrheqlqxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583690.3684483-2115-64181500965359/AnsiballZ_systemd_service.py'
Dec 01 10:08:10 compute-2 sudo[217180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:10 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:10 compute-2 python3.9[217182]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 10:08:11 compute-2 sudo[217180]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:11.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:11 compute-2 sudo[217335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkelvmdgrdkouknekgsxhnuuxpbaixzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583691.153119-2115-194481655968815/AnsiballZ_systemd_service.py'
Dec 01 10:08:11 compute-2 sudo[217335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:11.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:11 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:08:11 compute-2 python3.9[217337]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 10:08:11 compute-2 ceph-mon[76053]: pgmap v515: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 182 B/s rd, 0 op/s
Dec 01 10:08:11 compute-2 sudo[217335]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:11 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:12 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:12 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4003660 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:13.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:13 compute-2 podman[217365]: 2025-12-01 10:08:13.401023153 +0000 UTC m=+0.056057254 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 01 10:08:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:13.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:13 compute-2 ceph-mon[76053]: pgmap v516: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 182 B/s rd, 0 op/s
Dec 01 10:08:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:13 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:14 compute-2 sudo[217512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbbfzjtfqeqkgrqzcwzxffzjmumxhmlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583693.984077-2292-271633301236529/AnsiballZ_file.py'
Dec 01 10:08:14 compute-2 sudo[217512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:14 compute-2 python3.9[217514]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:08:14 compute-2 sudo[217512]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:14 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003a00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:14 compute-2 sudo[217664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xryezvhqatmufldypbcejtztgtmghwlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583694.5836036-2292-210097988447658/AnsiballZ_file.py'
Dec 01 10:08:14 compute-2 sudo[217664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:14 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:15 compute-2 python3.9[217666]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:08:15 compute-2 sudo[217664]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:15.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:15 compute-2 sudo[217818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kowvypunioplndkaodexkbwhhgxjoneg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583695.1830301-2292-225478276690445/AnsiballZ_file.py'
Dec 01 10:08:15 compute-2 sudo[217818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:15.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:15 compute-2 python3.9[217820]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:08:15 compute-2 sudo[217818]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:15 compute-2 ceph-mon[76053]: pgmap v517: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:08:16 compute-2 sudo[217970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbcrdjnmoumzojuarbcinjcswrbdrxmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583695.755332-2292-89529784156497/AnsiballZ_file.py'
Dec 01 10:08:16 compute-2 sudo[217970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:16 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4003660 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:16 compute-2 python3.9[217972]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:08:16 compute-2 sudo[217970]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:16 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:16 compute-2 sudo[218122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hiqoecuwymurydpikbjlcrwpwwtgknnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583696.3576317-2292-138089508618100/AnsiballZ_file.py'
Dec 01 10:08:16 compute-2 sudo[218122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:16 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:08:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100816 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:08:16 compute-2 python3.9[218124]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:08:16 compute-2 sudo[218122]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:16 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003a20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:17 compute-2 sudo[218275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyczjgamhsogxcgtkylhemjozooxkqga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583696.9645274-2292-165687287201614/AnsiballZ_file.py'
Dec 01 10:08:17 compute-2 sudo[218275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:17.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:17 compute-2 python3.9[218277]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:08:17 compute-2 sudo[218275]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:17.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:17 compute-2 sudo[218428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zktfhpxkpnjuygssezjzfrzsadtlxwau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583697.6070614-2292-184585088652333/AnsiballZ_file.py'
Dec 01 10:08:17 compute-2 sudo[218428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:17 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:08:17 compute-2 ceph-mon[76053]: pgmap v518: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Dec 01 10:08:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:18 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:18 compute-2 python3.9[218430]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:08:18 compute-2 sudo[218428]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:18 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4004370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:18 compute-2 sudo[218580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwhfyiuvkjeqalzyzajnjsvkvtseeitz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583698.183191-2292-91303774627407/AnsiballZ_file.py'
Dec 01 10:08:18 compute-2 sudo[218580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:18 compute-2 python3.9[218582]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:08:18 compute-2 sudo[218580]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:18 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:19.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:19.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:19 compute-2 ceph-mon[76053]: pgmap v519: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Dec 01 10:08:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:20 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003a40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:20 compute-2 sudo[218734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmktfzgwnpdnbbagmeypgslofkdglvdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583700.2709706-2463-178410556824919/AnsiballZ_file.py'
Dec 01 10:08:20 compute-2 sudo[218734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:20 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:20 compute-2 python3.9[218736]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:08:20 compute-2 sudo[218734]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:20 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:08:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:20 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:08:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:20 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4004370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:21 compute-2 sudo[218836]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:08:21 compute-2 sudo[218836]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:08:21 compute-2 sudo[218836]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:21 compute-2 sudo[218912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjpyjjgwjomvbfqogqaizmvazdzvukek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583700.916345-2463-164743522518391/AnsiballZ_file.py'
Dec 01 10:08:21 compute-2 sudo[218912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:21 compute-2 python3.9[218914]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:08:21 compute-2 sudo[218912]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:21.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:21.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:21 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:08:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:21 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:08:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:21 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:08:21 compute-2 sudo[219083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skqnjfsddczzsfqgjwwcieaxtsahbqop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583701.5280712-2463-271653925170792/AnsiballZ_file.py'
Dec 01 10:08:21 compute-2 sudo[219083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:21 compute-2 podman[219039]: 2025-12-01 10:08:21.833450157 +0000 UTC m=+0.075235238 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 10:08:21 compute-2 python3.9[219088]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:08:21 compute-2 sudo[219083]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:22 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:22 compute-2 ceph-mon[76053]: pgmap v520: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Dec 01 10:08:22 compute-2 sudo[219244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrqixhygyjodcztbolghpzqzgegvqvpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583702.1267772-2463-203105865222204/AnsiballZ_file.py'
Dec 01 10:08:22 compute-2 sudo[219244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:22 compute-2 python3.9[219246]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:08:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:22 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003a60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:22 compute-2 sudo[219244]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:22 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003a60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:22 compute-2 sudo[219396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlxmsyysbuqhfqhicoyzchywtaxauujv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583702.7258577-2463-139629119409479/AnsiballZ_file.py'
Dec 01 10:08:22 compute-2 sudo[219396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:23 compute-2 python3.9[219398]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:08:23 compute-2 sudo[219396]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:08:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:23.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:08:23 compute-2 ceph-mon[76053]: pgmap v521: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 852 B/s rd, 426 B/s wr, 1 op/s
Dec 01 10:08:23 compute-2 sudo[219550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdpzknpnefsfzbdfjwlnichhlxrrdger ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583703.3156087-2463-38376193643081/AnsiballZ_file.py'
Dec 01 10:08:23 compute-2 sudo[219550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:23.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:23 compute-2 python3.9[219552]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:08:23 compute-2 sudo[219550]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:24 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003a60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:24 compute-2 sudo[219702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjwyluutrfxpfxjdeaqasnyxfcnstqee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583703.900961-2463-80400850270653/AnsiballZ_file.py'
Dec 01 10:08:24 compute-2 sudo[219702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:24 compute-2 python3.9[219704]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:08:24 compute-2 sudo[219702]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:24 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:08:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:24 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4004370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:24 compute-2 sudo[219854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzmjxmgjcvnbrycueichaxaqoaodjsve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583704.502164-2463-162352780698706/AnsiballZ_file.py'
Dec 01 10:08:24 compute-2 sudo[219854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:24 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 10:08:24 compute-2 python3.9[219856]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:08:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:24 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:24 compute-2 sudo[219854]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:08:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:25.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:08:25 compute-2 ceph-mon[76053]: pgmap v522: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 852 B/s rd, 426 B/s wr, 1 op/s
Dec 01 10:08:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:25.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:26 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:26 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:26 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:08:26 compute-2 sudo[220009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jracvvaldknscievyknapwvafohgfwzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583706.5325942-2637-153115539784340/AnsiballZ_command.py'
Dec 01 10:08:26 compute-2 sudo[220009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:26 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a8008f00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:26 compute-2 python3.9[220011]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 10:08:27 compute-2 sudo[220009]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:08:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:27.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:08:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:27.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:27 compute-2 ceph-mon[76053]: pgmap v523: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.6 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 10:08:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:27 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:08:27 compute-2 python3.9[220165]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 01 10:08:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:28 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4004370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:28 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4004370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:28 compute-2 sudo[220315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfdyquueqmprckmqphzpoyzvouvhofyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583708.388514-2691-196571379238548/AnsiballZ_systemd_service.py'
Dec 01 10:08:28 compute-2 sudo[220315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:28 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700035b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:28 compute-2 python3.9[220317]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 01 10:08:28 compute-2 systemd[1]: Reloading.
Dec 01 10:08:29 compute-2 systemd-rc-local-generator[220345]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:08:29 compute-2 systemd-sysv-generator[220348]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:08:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:29 compute-2 sudo[220315]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:08:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:29.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:08:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:29.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:29 compute-2 ceph-mon[76053]: pgmap v524: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Dec 01 10:08:29 compute-2 sudo[220504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqczclafodobdamqxmrzvvswehkroaao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583709.6792445-2715-42643546558554/AnsiballZ_command.py'
Dec 01 10:08:29 compute-2 sudo[220504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100830 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:08:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:30 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003a60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:30 compute-2 python3.9[220506]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 10:08:30 compute-2 sudo[220504]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:30 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a8008f00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:30 compute-2 sudo[220657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgcfxkimxrkildhhjtoeinrajvilgael ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583710.3544056-2715-110509346858447/AnsiballZ_command.py'
Dec 01 10:08:30 compute-2 sudo[220657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:30 compute-2 python3.9[220659]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 10:08:30 compute-2 sudo[220657]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:30 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:08:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:30 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:08:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:30 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4004370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:31 compute-2 sudo[220811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spewqcmetdehtpwnbtfyvzhxtijxmoki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583710.9749386-2715-217620734501805/AnsiballZ_command.py'
Dec 01 10:08:31 compute-2 sudo[220811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:08:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:31.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:08:31 compute-2 python3.9[220813]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 10:08:31 compute-2 sudo[220811]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:31.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:31 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:08:31 compute-2 sudo[220965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blwbdyuajqrgsxiupvzmaunuhiszccty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583711.6426473-2715-90458508950074/AnsiballZ_command.py'
Dec 01 10:08:31 compute-2 sudo[220965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:31 compute-2 ceph-mon[76053]: pgmap v525: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Dec 01 10:08:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:32 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700035b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:32 compute-2 python3.9[220967]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 10:08:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:32 compute-2 sudo[220965]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:32 compute-2 sudo[221118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chvwgiznmraaovppnkydjptdvtlyyazg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583712.2539227-2715-8034441898547/AnsiballZ_command.py'
Dec 01 10:08:32 compute-2 sudo[221118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:32 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003a60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:32 compute-2 python3.9[221120]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 10:08:32 compute-2 sudo[221118]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:32 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a8008f00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:33 compute-2 sudo[221271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvsvxuaesabgypoewydfgztiauemlrqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583712.8820293-2715-263413219458774/AnsiballZ_command.py'
Dec 01 10:08:33 compute-2 sudo[221271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:33 compute-2 python3.9[221273]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 10:08:33 compute-2 sudo[221271]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:08:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:33.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:08:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:33.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:33 compute-2 sudo[221426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uiftuvdsckeffbvozptbigxhcpkipnrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583713.4946935-2715-5948187741753/AnsiballZ_command.py'
Dec 01 10:08:33 compute-2 sudo[221426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:33 compute-2 ceph-mon[76053]: pgmap v526: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.3 KiB/s rd, 1.4 KiB/s wr, 4 op/s
Dec 01 10:08:33 compute-2 python3.9[221428]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 10:08:34 compute-2 sudo[221426]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:34 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4004370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:34 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 10:08:34 compute-2 sudo[221579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gctqeyatmpbtvtkwtddfhhicxnjqessc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583714.1316864-2715-209767513936852/AnsiballZ_command.py'
Dec 01 10:08:34 compute-2 sudo[221579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:34 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700035b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:34 compute-2 python3.9[221581]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 01 10:08:34 compute-2 sudo[221579]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:34 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003a60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:08:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:35.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:08:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:35.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:36 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003a60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:36 compute-2 ceph-mon[76053]: pgmap v527: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.9 KiB/s rd, 1.1 KiB/s wr, 4 op/s
Dec 01 10:08:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:36 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4004370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:36 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:08:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100836 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:08:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:36 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4004370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:37 compute-2 ceph-mon[76053]: pgmap v528: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.8 KiB/s rd, 1.3 KiB/s wr, 5 op/s
Dec 01 10:08:37 compute-2 sudo[221735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-heyvddadzollslowqketgckkloyossct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583717.003801-2922-32724635600419/AnsiballZ_file.py'
Dec 01 10:08:37 compute-2 sudo[221735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:37.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:37 compute-2 python3.9[221737]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:08:37 compute-2 sudo[221735]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:37.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:37 compute-2 sudo[221888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcedreazcvovnndammazksxgyisvijpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583717.6397123-2922-93713959846531/AnsiballZ_file.py'
Dec 01 10:08:37 compute-2 sudo[221888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:38 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4004370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:38 compute-2 python3.9[221890]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:08:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:38 compute-2 sudo[221888]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:38 compute-2 sudo[222040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywswgicghdxjoufueojbfmoctapwbvhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583718.2616467-2922-182379777671980/AnsiballZ_file.py'
Dec 01 10:08:38 compute-2 sudo[222040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:38 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a8008f00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:38 compute-2 python3.9[222042]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:08:38 compute-2 sudo[222040]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:38 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700035b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:39 compute-2 ceph-mon[76053]: pgmap v529: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 767 B/s wr, 3 op/s
Dec 01 10:08:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:08:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:39.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:08:39 compute-2 sudo[222207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvtkszyzkbtlijfgctpeqvqkqxuugkij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583719.275302-2988-215653347536147/AnsiballZ_file.py'
Dec 01 10:08:39 compute-2 sudo[222207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:39 compute-2 podman[222168]: 2025-12-01 10:08:39.585731347 +0000 UTC m=+0.062277894 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 01 10:08:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:08:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:39.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:08:39 compute-2 python3.9[222215]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:08:39 compute-2 sudo[222207]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:40 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003a60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:40 compute-2 sudo[222365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hezxyiiwfxoxknmhyculmeteypdkbwfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583719.9132855-2988-122053950497733/AnsiballZ_file.py'
Dec 01 10:08:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:40 compute-2 sudo[222365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:08:40 compute-2 python3.9[222367]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:08:40 compute-2 sudo[222365]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:40 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4004c90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:40 compute-2 sudo[222517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxzziuigvgryzkdzefdhhjjxhjzgvioo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583720.525972-2988-168552851329860/AnsiballZ_file.py'
Dec 01 10:08:40 compute-2 sudo[222517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:40 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a8008f00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:40 compute-2 python3.9[222519]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:08:41 compute-2 sudo[222517]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:41 compute-2 sudo[222568]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:08:41 compute-2 sudo[222568]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:08:41 compute-2 sudo[222568]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:41 compute-2 ceph-mon[76053]: pgmap v530: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 767 B/s wr, 3 op/s
Dec 01 10:08:41 compute-2 sudo[222696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsffrusdsdgnsrxrdkdpyatcekszmkgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583721.167346-2988-141921234494059/AnsiballZ_file.py'
Dec 01 10:08:41 compute-2 sudo[222696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:08:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:41.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:08:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:41.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:41 compute-2 python3.9[222698]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:08:41 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:08:41 compute-2 sudo[222696]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:42 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700035b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:42 compute-2 sudo[222848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-siwoueuhanpbxvfzfcccporzmsltyzct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583721.7913225-2988-157899810255729/AnsiballZ_file.py'
Dec 01 10:08:42 compute-2 sudo[222848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:42 compute-2 python3.9[222850]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:08:42 compute-2 sudo[222848]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:42 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700035b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:42 compute-2 sudo[223000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzodcjjeahfzutxpxaqimgjkqbeguhgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583722.4178803-2988-279296525744875/AnsiballZ_file.py'
Dec 01 10:08:42 compute-2 sudo[223000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:42 compute-2 python3.9[223002]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:08:42 compute-2 sudo[223000]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:42 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4004c90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:43 compute-2 sudo[223154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynovuxudppvmsndiyxtlqgwldxdvsgft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583723.0683444-2988-133647833361154/AnsiballZ_file.py'
Dec 01 10:08:43 compute-2 sudo[223154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:08:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:43.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:08:43 compute-2 ceph-mon[76053]: pgmap v531: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 767 B/s wr, 3 op/s
Dec 01 10:08:43 compute-2 python3.9[223156]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:08:43 compute-2 sudo[223154]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:43.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:43 compute-2 podman[223157]: 2025-12-01 10:08:43.672079761 +0000 UTC m=+0.085966113 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 01 10:08:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:44 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a8008f00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:44 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003a60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:08:44 compute-2 kernel: ganesha.nfsd[205608]: segfault at 50 ip 00007ff45513d32e sp 00007ff41affc210 error 4 in libntirpc.so.5.8[7ff455122000+2c000] likely on CPU 4 (core 0, socket 4)
Dec 01 10:08:44 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec 01 10:08:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:44 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700035b0 fd 48 proxy ignored for local
Dec 01 10:08:44 compute-2 systemd[1]: Started Process Core Dump (PID 223202/UID 0).
Dec 01 10:08:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:08:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:45.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:08:45 compute-2 ceph-mon[76053]: pgmap v532: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 255 B/s wr, 1 op/s
Dec 01 10:08:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:45.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:46 compute-2 systemd-coredump[223203]: Process 196873 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 58:
                                                    #0  0x00007ff45513d32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Dec 01 10:08:46 compute-2 systemd[1]: systemd-coredump@12-223202-0.service: Deactivated successfully.
Dec 01 10:08:46 compute-2 systemd[1]: systemd-coredump@12-223202-0.service: Consumed 1.572s CPU time.
Dec 01 10:08:46 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:08:46 compute-2 podman[223210]: 2025-12-01 10:08:46.696654846 +0000 UTC m=+0.033136884 container died 43f4e5db97a378ac1568359b572993dab259dd0179f811da5a7e795e696238ba (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 10:08:46 compute-2 systemd[1]: var-lib-containers-storage-overlay-a9ece781eac7d44f9adde4c9a385559b8088d6e677b8e9f086f7a414c83ac191-merged.mount: Deactivated successfully.
Dec 01 10:08:46 compute-2 podman[223210]: 2025-12-01 10:08:46.741550932 +0000 UTC m=+0.078032920 container remove 43f4e5db97a378ac1568359b572993dab259dd0179f811da5a7e795e696238ba (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 01 10:08:46 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec 01 10:08:46 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec 01 10:08:46 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 2.189s CPU time.
Dec 01 10:08:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:08:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:47.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:08:47 compute-2 ceph-mon[76053]: pgmap v533: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 255 B/s wr, 1 op/s
Dec 01 10:08:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:47.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:49.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:49.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:49 compute-2 ceph-mon[76053]: pgmap v534: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:08:49 compute-2 sudo[223382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-watwwexozcqkbfeomypramoslttwxucu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583729.355327-3313-208139648464005/AnsiballZ_getent.py'
Dec 01 10:08:49 compute-2 sudo[223382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:49 compute-2 python3.9[223384]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Dec 01 10:08:50 compute-2 sudo[223382]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:50 compute-2 sudo[223535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecmxzgnnouzwmrdwfcgpckzplectlzhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583730.3434966-3337-146820825125415/AnsiballZ_group.py'
Dec 01 10:08:50 compute-2 sudo[223535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:50 compute-2 python3.9[223537]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 01 10:08:50 compute-2 groupadd[223538]: group added to /etc/group: name=nova, GID=42436
Dec 01 10:08:50 compute-2 groupadd[223538]: group added to /etc/gshadow: name=nova
Dec 01 10:08:50 compute-2 groupadd[223538]: new group: name=nova, GID=42436
Dec 01 10:08:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100850 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:08:50 compute-2 sudo[223535]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:51.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:51.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:51 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:08:51 compute-2 ceph-mon[76053]: pgmap v535: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:08:51 compute-2 sudo[223695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkxpuscxhsptglandpylvyltydkfphfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583731.418751-3361-59020310630374/AnsiballZ_user.py'
Dec 01 10:08:51 compute-2 sudo[223695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:08:51 compute-2 podman[223697]: 2025-12-01 10:08:51.956880877 +0000 UTC m=+0.087429682 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 01 10:08:52 compute-2 python3.9[223698]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 01 10:08:52 compute-2 useradd[223725]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Dec 01 10:08:52 compute-2 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 01 10:08:52 compute-2 useradd[223725]: add 'nova' to group 'libvirt'
Dec 01 10:08:52 compute-2 useradd[223725]: add 'nova' to shadow group 'libvirt'
Dec 01 10:08:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:52 compute-2 sudo[223695]: pam_unix(sudo:session): session closed for user root
Dec 01 10:08:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:53 compute-2 sshd-session[223757]: Accepted publickey for zuul from 192.168.122.30 port 48062 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 10:08:53 compute-2 systemd-logind[795]: New session 54 of user zuul.
Dec 01 10:08:53 compute-2 systemd[1]: Started Session 54 of User zuul.
Dec 01 10:08:53 compute-2 sshd-session[223757]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 10:08:53 compute-2 sshd-session[223761]: Received disconnect from 192.168.122.30 port 48062:11: disconnected by user
Dec 01 10:08:53 compute-2 sshd-session[223761]: Disconnected from user zuul 192.168.122.30 port 48062
Dec 01 10:08:53 compute-2 sshd-session[223757]: pam_unix(sshd:session): session closed for user zuul
Dec 01 10:08:53 compute-2 systemd[1]: session-54.scope: Deactivated successfully.
Dec 01 10:08:53 compute-2 systemd-logind[795]: Session 54 logged out. Waiting for processes to exit.
Dec 01 10:08:53 compute-2 systemd-logind[795]: Removed session 54.
Dec 01 10:08:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:53.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:53.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:53 compute-2 ceph-mon[76053]: pgmap v536: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:08:54 compute-2 python3.9[223912]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:08:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:54 compute-2 python3.9[224033]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583733.5805023-3436-255433035300578/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:08:54 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:08:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:55 compute-2 python3.9[224183]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:08:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:55.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:55 compute-2 python3.9[224261]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:08:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:55.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:55 compute-2 ceph-mon[76053]: pgmap v537: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:08:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:56 compute-2 python3.9[224411]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:08:56 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:08:56 compute-2 python3.9[224532]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583735.7713423-3436-273778939597384/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:08:57 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 13.
Dec 01 10:08:57 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 10:08:57 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 2.189s CPU time.
Dec 01 10:08:57 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec 01 10:08:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:57 compute-2 podman[224703]: 2025-12-01 10:08:57.316416233 +0000 UTC m=+0.046064557 container create 64936614362eba2d484ede66f4ae3d59fef36d8444e0a0b1a5be8f708ea55c64 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 10:08:57 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9353b63dd84f9a810916bf1d53b4086eb5eaf5c0e3a3a4222d09e997ac6f95eb/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 01 10:08:57 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9353b63dd84f9a810916bf1d53b4086eb5eaf5c0e3a3a4222d09e997ac6f95eb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 10:08:57 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9353b63dd84f9a810916bf1d53b4086eb5eaf5c0e3a3a4222d09e997ac6f95eb/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 10:08:57 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9353b63dd84f9a810916bf1d53b4086eb5eaf5c0e3a3a4222d09e997ac6f95eb/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 10:08:57 compute-2 podman[224703]: 2025-12-01 10:08:57.378799189 +0000 UTC m=+0.108447533 container init 64936614362eba2d484ede66f4ae3d59fef36d8444e0a0b1a5be8f708ea55c64 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec 01 10:08:57 compute-2 podman[224703]: 2025-12-01 10:08:57.384316491 +0000 UTC m=+0.113964805 container start 64936614362eba2d484ede66f4ae3d59fef36d8444e0a0b1a5be8f708ea55c64 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_REF=squid, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 10:08:57 compute-2 bash[224703]: 64936614362eba2d484ede66f4ae3d59fef36d8444e0a0b1a5be8f708ea55c64
Dec 01 10:08:57 compute-2 podman[224703]: 2025-12-01 10:08:57.29727148 +0000 UTC m=+0.026919824 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 10:08:57 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 10:08:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:08:57 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 01 10:08:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:08:57 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 01 10:08:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:08:57 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 01 10:08:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:08:57 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 01 10:08:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:08:57 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 01 10:08:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:08:57 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 01 10:08:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:57.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:08:57 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 01 10:08:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:08:57 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:08:57 compute-2 python3.9[224744]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:08:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:57.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:57 compute-2 ceph-mon[76053]: pgmap v538: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 01 10:08:58 compute-2 python3.9[224909]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583737.02035-3436-179379331532953/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=d01cc1b48d783e4ed08d12bb4d0a107aba230a69 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:08:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:58 compute-2 ceph-osd[78644]: bluestore.MempoolThread fragmentation_score=0.000022 took=0.000081s
Dec 01 10:08:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:58 compute-2 python3.9[225059]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:08:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:08:59 compute-2 python3.9[225180]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583738.2166488-3436-103462433767925/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:08:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:59.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:08:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:08:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:59.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:08:59 compute-2 ceph-mon[76053]: pgmap v539: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:08:59 compute-2 python3.9[225332]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:09:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:00 compute-2 python3.9[225453]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583739.3843422-3436-115297685155928/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:09:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:01 compute-2 sudo[225479]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:09:01 compute-2 sudo[225479]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:09:01 compute-2 sudo[225479]: pam_unix(sudo:session): session closed for user root
Dec 01 10:09:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:01.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:01.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:01 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:09:01 compute-2 ceph-mon[76053]: pgmap v540: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:09:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:02 compute-2 sudo[225630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgxahhamizceroaxxbrmjbohqzxxnbrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583742.33234-3685-160341808158253/AnsiballZ_file.py'
Dec 01 10:09:02 compute-2 sudo[225630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:09:02 compute-2 python3.9[225632]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:09:02 compute-2 sudo[225630]: pam_unix(sudo:session): session closed for user root
Dec 01 10:09:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:03 compute-2 sudo[225784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcplyuorxamsspnzivmtajhefshalufy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583743.1830537-3710-52437958598258/AnsiballZ_copy.py'
Dec 01 10:09:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:09:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:03.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:09:03 compute-2 sudo[225784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:09:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:03 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:09:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:03 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:09:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:03.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:03 compute-2 python3.9[225786]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:09:03 compute-2 sudo[225784]: pam_unix(sudo:session): session closed for user root
Dec 01 10:09:03 compute-2 ceph-mon[76053]: pgmap v541: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Dec 01 10:09:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:04 compute-2 sudo[225936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-duyvjkrauwfyacqwqzlohsuzghawdyfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583743.9727407-3733-214851425215136/AnsiballZ_stat.py'
Dec 01 10:09:04 compute-2 sudo[225936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:09:04 compute-2 python3.9[225938]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 10:09:04 compute-2 sudo[225936]: pam_unix(sudo:session): session closed for user root
Dec 01 10:09:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:09:04.693 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:09:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:09:04.694 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:09:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:09:04.695 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:09:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:05 compute-2 sudo[226089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnymxzsadyeoyauswpzvzwsmptffgvtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583744.9600804-3758-163902034562064/AnsiballZ_stat.py'
Dec 01 10:09:05 compute-2 sudo[226089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:09:05 compute-2 python3.9[226091]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:09:05 compute-2 sudo[226089]: pam_unix(sudo:session): session closed for user root
Dec 01 10:09:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:09:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:05.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:09:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:05.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:05 compute-2 sudo[226213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqxsipxvrlqkjxvcgbvgxmcgdarwyscv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583744.9600804-3758-163902034562064/AnsiballZ_copy.py'
Dec 01 10:09:05 compute-2 sudo[226213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:09:05 compute-2 ceph-mon[76053]: pgmap v542: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Dec 01 10:09:05 compute-2 python3.9[226215]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1764583744.9600804-3758-163902034562064/.source _original_basename=.vhybe949 follow=False checksum=db14e672cb4774dd20678b7f16304ab323199ab6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Dec 01 10:09:06 compute-2 sudo[226213]: pam_unix(sudo:session): session closed for user root
Dec 01 10:09:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:06 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:09:07 compute-2 python3.9[226367]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 10:09:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:09:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:07.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:09:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:07.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:07 compute-2 python3.9[226521]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:09:08 compute-2 sudo[226522]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:09:08 compute-2 sudo[226522]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:09:08 compute-2 sudo[226522]: pam_unix(sudo:session): session closed for user root
Dec 01 10:09:08 compute-2 ceph-mon[76053]: pgmap v543: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 938 B/s wr, 3 op/s
Dec 01 10:09:08 compute-2 sudo[226570]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Dec 01 10:09:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:08 compute-2 sudo[226570]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:09:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:08 compute-2 sudo[226570]: pam_unix(sudo:session): session closed for user root
Dec 01 10:09:08 compute-2 python3.9[226694]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583747.4528823-3835-51886259019503/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:09:08 compute-2 sudo[226713]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:09:08 compute-2 sudo[226713]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:09:08 compute-2 sudo[226713]: pam_unix(sudo:session): session closed for user root
Dec 01 10:09:08 compute-2 sudo[226750]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:09:08 compute-2 sudo[226750]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:09:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:09 compute-2 sudo[226750]: pam_unix(sudo:session): session closed for user root
Dec 01 10:09:09 compute-2 python3.9[226944]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 01 10:09:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:09:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:09.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:09:09 compute-2 ceph-mon[76053]: pgmap v544: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Dec 01 10:09:09 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:09:09 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:09:09 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:09:09 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:09:09 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:09:09 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:09:09 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:09:09 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:09:09 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:09:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 10:09:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec 01 10:09:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec 01 10:09:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec 01 10:09:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec 01 10:09:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec 01 10:09:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec 01 10:09:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:09:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:09:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:09:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec 01 10:09:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:09:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec 01 10:09:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec 01 10:09:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec 01 10:09:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec 01 10:09:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec 01 10:09:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec 01 10:09:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec 01 10:09:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec 01 10:09:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec 01 10:09:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec 01 10:09:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec 01 10:09:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec 01 10:09:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 10:09:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec 01 10:09:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 10:09:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:09.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:09 compute-2 podman[227053]: 2025-12-01 10:09:09.880964278 +0000 UTC m=+0.061829912 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 01 10:09:10 compute-2 python3.9[227090]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583748.7915459-3880-145683456269637/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 01 10:09:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:10 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff724000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:10 compute-2 ceph-mon[76053]: pgmap v545: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1022 B/s wr, 3 op/s
Dec 01 10:09:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:09:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:10 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff71c0020f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:10 compute-2 sudo[227250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxupsknnhmeedmfvwtcpvajvnimcmrbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583750.621664-3931-2665915091260/AnsiballZ_container_config_data.py'
Dec 01 10:09:10 compute-2 sudo[227250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:09:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:10 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f8000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:11 compute-2 python3.9[227252]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Dec 01 10:09:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:11 compute-2 sudo[227250]: pam_unix(sudo:session): session closed for user root
Dec 01 10:09:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:09:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:11.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:09:11 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:09:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:11.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:11 compute-2 sudo[227404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gylxiabqeluzruycppcmhvxjwarupwmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583751.526681-3958-208387467898037/AnsiballZ_container_config_hash.py'
Dec 01 10:09:11 compute-2 sudo[227404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:09:12 compute-2 python3.9[227406]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 01 10:09:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:12 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f4000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:12 compute-2 sudo[227404]: pam_unix(sudo:session): session closed for user root
Dec 01 10:09:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:12 compute-2 ceph-mon[76053]: pgmap v546: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.6 KiB/s rd, 1.1 KiB/s wr, 4 op/s
Dec 01 10:09:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:12 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:12 compute-2 sudo[227556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qghqwtqpwqynnqzgykqdzsumjwjfftrp ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764583752.567278-3988-82713365657720/AnsiballZ_edpm_container_manage.py'
Dec 01 10:09:12 compute-2 sudo[227556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:09:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100912 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:09:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:12 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff71c002bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:13 compute-2 python3[227558]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Dec 01 10:09:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:09:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:13.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:09:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:09:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:13.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:09:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:14 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:14 compute-2 podman[227595]: 2025-12-01 10:09:14.439211219 +0000 UTC m=+0.075197407 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.572287) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583754572479, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 1437, "num_deletes": 251, "total_data_size": 3687020, "memory_usage": 3734208, "flush_reason": "Manual Compaction"}
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583754598555, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 2398176, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 19376, "largest_seqno": 20808, "table_properties": {"data_size": 2391991, "index_size": 3448, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 12805, "raw_average_key_size": 19, "raw_value_size": 2379753, "raw_average_value_size": 3683, "num_data_blocks": 152, "num_entries": 646, "num_filter_entries": 646, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764583635, "oldest_key_time": 1764583635, "file_creation_time": 1764583754, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 26450 microseconds, and 12224 cpu microseconds.
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.598744) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 2398176 bytes OK
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.598771) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.604780) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.604832) EVENT_LOG_v1 {"time_micros": 1764583754604823, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.604859) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 3680358, prev total WAL file size 3787052, number of live WAL files 2.
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.605900) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031323535' seq:72057594037927935, type:22 .. '7061786F730031353037' seq:0, type:0; will stop at (end)
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(2341KB)], [36(12MB)]
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583754606003, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 15809306, "oldest_snapshot_seqno": -1}
Dec 01 10:09:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:14 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 5032 keys, 13611005 bytes, temperature: kUnknown
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583754693248, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 13611005, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13576057, "index_size": 21270, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12613, "raw_key_size": 128184, "raw_average_key_size": 25, "raw_value_size": 13483231, "raw_average_value_size": 2679, "num_data_blocks": 874, "num_entries": 5032, "num_filter_entries": 5032, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764583754, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:09:14 compute-2 ceph-mon[76053]: pgmap v547: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 464 B/s wr, 2 op/s
Dec 01 10:09:14 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.693531) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 13611005 bytes
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.695344) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 181.0 rd, 155.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.3, 12.8 +0.0 blob) out(13.0 +0.0 blob), read-write-amplify(12.3) write-amplify(5.7) OK, records in: 5552, records dropped: 520 output_compression: NoCompression
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.695366) EVENT_LOG_v1 {"time_micros": 1764583754695356, "job": 20, "event": "compaction_finished", "compaction_time_micros": 87338, "compaction_time_cpu_micros": 34047, "output_level": 6, "num_output_files": 1, "total_output_size": 13611005, "num_input_records": 5552, "num_output_records": 5032, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583754695846, "job": 20, "event": "table_file_deletion", "file_number": 38}
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583754698374, "job": 20, "event": "table_file_deletion", "file_number": 36}
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.605816) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.698455) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.698461) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.698463) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.698465) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:09:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.698467) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:09:14 compute-2 sudo[227617]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:09:14 compute-2 sudo[227617]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:09:14 compute-2 sudo[227617]: pam_unix(sudo:session): session closed for user root
Dec 01 10:09:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:14 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:15.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:15.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:15 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:09:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:16 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff71c002bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:16 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:16 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:09:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:16 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:17 compute-2 ceph-mon[76053]: pgmap v548: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 464 B/s wr, 2 op/s
Dec 01 10:09:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:17 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 10:09:17 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Cumulative writes: 3714 writes, 20K keys, 3714 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.05 MB/s
                                           Cumulative WAL: 3714 writes, 3714 syncs, 1.00 writes per sync, written: 0.05 GB, 0.05 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1468 writes, 6982 keys, 1468 commit groups, 1.0 writes per commit group, ingest: 16.88 MB, 0.03 MB/s
                                           Interval WAL: 1468 writes, 1468 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    129.4      0.24              0.09        10    0.024       0      0       0.0       0.0
                                             L6      1/0   12.98 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.7    161.7    140.0      0.82              0.31         9    0.092     44K   4685       0.0       0.0
                                            Sum      1/0   12.98 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.7    125.4    137.6      1.06              0.40        19    0.056     44K   4685       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   6.6    136.5    136.1      0.43              0.17         8    0.054     22K   2385       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    161.7    140.0      0.82              0.31         9    0.092     44K   4685       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    130.7      0.24              0.09         9    0.026       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.030, interval 0.009
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.14 GB write, 0.12 MB/s write, 0.13 GB read, 0.11 MB/s read, 1.1 seconds
                                           Interval compaction: 0.06 GB write, 0.10 MB/s write, 0.06 GB read, 0.10 MB/s read, 0.4 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555b631689b0#2 capacity: 304.00 MB usage: 7.55 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000139 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(406,7.18 MB,2.36205%) FilterBlock(19,131.05 KB,0.0420972%) IndexBlock(19,245.52 KB,0.0788689%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Dec 01 10:09:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:09:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:17.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:09:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:09:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:17.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:09:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:18 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:18 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff71c002bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:18 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:19.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:09:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:19.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:09:19 compute-2 ceph-mon[76053]: pgmap v549: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 278 B/s rd, 92 B/s wr, 0 op/s
Dec 01 10:09:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:20 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:20 compute-2 ceph-mon[76053]: pgmap v550: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 278 B/s rd, 92 B/s wr, 0 op/s
Dec 01 10:09:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:20 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:20 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff71c002bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:21 compute-2 sudo[227688]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:09:21 compute-2 sudo[227688]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:09:21 compute-2 sudo[227688]: pam_unix(sudo:session): session closed for user root
Dec 01 10:09:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:09:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:21.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:09:21 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:09:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:21.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:22 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:22 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:22 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:09:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:23.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:09:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:23.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:24 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff71c002bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:24 compute-2 ceph-mon[76053]: pgmap v551: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Dec 01 10:09:24 compute-2 podman[227714]: 2025-12-01 10:09:24.282644842 +0000 UTC m=+1.917800666 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Dec 01 10:09:24 compute-2 podman[227571]: 2025-12-01 10:09:24.301900878 +0000 UTC m=+11.102163635 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 01 10:09:24 compute-2 podman[227774]: 2025-12-01 10:09:24.469738139 +0000 UTC m=+0.059956775 container create 4280beb294a27f3c0fe8dc7248224a87024c9f62bde2f199a1ad1a5e6305f7dc (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, container_name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 01 10:09:24 compute-2 podman[227774]: 2025-12-01 10:09:24.436198665 +0000 UTC m=+0.026417321 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 01 10:09:24 compute-2 python3[227558]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Dec 01 10:09:24 compute-2 sudo[227556]: pam_unix(sudo:session): session closed for user root
Dec 01 10:09:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:24 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:24 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:25 compute-2 sudo[227961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbnkfedtmuqadxlsbdfkfvarteifidtx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583764.8233392-4012-228944635769431/AnsiballZ_stat.py'
Dec 01 10:09:25 compute-2 sudo[227961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:09:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:25 compute-2 ceph-mon[76053]: pgmap v552: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:09:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:09:25 compute-2 python3.9[227963]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 10:09:25 compute-2 sudo[227961]: pam_unix(sudo:session): session closed for user root
Dec 01 10:09:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 01 10:09:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:25.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 01 10:09:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:25.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:26 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:26 compute-2 ceph-mon[76053]: pgmap v553: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:09:26 compute-2 sudo[228117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yybkyccenbloqycwhwkzsodkayizctkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583766.0798638-4048-277520938192277/AnsiballZ_container_config_data.py'
Dec 01 10:09:26 compute-2 sudo[228117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:09:26 compute-2 python3.9[228119]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Dec 01 10:09:26 compute-2 sudo[228117]: pam_unix(sudo:session): session closed for user root
Dec 01 10:09:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:26 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:26 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:09:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:26 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff71c002bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:27 compute-2 sudo[228270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alqsugjjcvazeluzhdvawdjdqdoiyygd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583766.9482949-4075-206869819505554/AnsiballZ_container_config_hash.py'
Dec 01 10:09:27 compute-2 sudo[228270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:09:27 compute-2 python3.9[228272]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 01 10:09:27 compute-2 sudo[228270]: pam_unix(sudo:session): session closed for user root
Dec 01 10:09:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:09:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:27.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:09:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:27.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:28 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:28 compute-2 sudo[228423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfpevhrusvrcuouxkqniknendyivsszh ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764583768.031073-4104-149582084689126/AnsiballZ_edpm_container_manage.py'
Dec 01 10:09:28 compute-2 sudo[228423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:09:28 compute-2 ceph-mon[76053]: pgmap v554: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:09:28 compute-2 python3[228425]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec 01 10:09:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:28 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:28 compute-2 podman[228463]: 2025-12-01 10:09:28.784022359 +0000 UTC m=+0.054678938 container create afb94358c6dc054416da57a6c90e644f1f51f777ba527b5ef49d4e3821af9682 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=nova_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, config_id=edpm, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 10:09:28 compute-2 podman[228463]: 2025-12-01 10:09:28.756229234 +0000 UTC m=+0.026885843 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 01 10:09:28 compute-2 python3[228425]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Dec 01 10:09:28 compute-2 sudo[228423]: pam_unix(sudo:session): session closed for user root
Dec 01 10:09:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:28 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:09:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:29.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:09:29 compute-2 sudo[228654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfwiehqkcbhjxtlqqpbdpxpzzvvewbuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583769.3856835-4129-212443240264176/AnsiballZ_stat.py'
Dec 01 10:09:29 compute-2 sudo[228654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:09:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:09:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:29.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:09:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:30 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff71c002bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:30 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:30 compute-2 python3.9[228656]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 10:09:30 compute-2 sudo[228654]: pam_unix(sudo:session): session closed for user root
Dec 01 10:09:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:30 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:31 compute-2 ceph-mon[76053]: pgmap v555: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:09:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:31 compute-2 sudo[228810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytledhlauubgdltszmstgbbqduunsvns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583771.1718972-4155-51439382094134/AnsiballZ_file.py'
Dec 01 10:09:31 compute-2 sudo[228810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:09:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:31.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:31 compute-2 python3.9[228812]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:09:31 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:09:31 compute-2 sudo[228810]: pam_unix(sudo:session): session closed for user root
Dec 01 10:09:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:31.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:32 compute-2 sudo[228961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yigpsqchtacbqmyujrskcqafvbrcwwkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583771.7330284-4155-254640005596444/AnsiballZ_copy.py'
Dec 01 10:09:32 compute-2 sudo[228961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:09:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:32 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:32 compute-2 python3.9[228963]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764583771.7330284-4155-254640005596444/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 01 10:09:32 compute-2 sudo[228961]: pam_unix(sudo:session): session closed for user root
Dec 01 10:09:32 compute-2 sudo[229037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oaloedqnncrcbqacitqqsjhhiqlajons ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583771.7330284-4155-254640005596444/AnsiballZ_systemd.py'
Dec 01 10:09:32 compute-2 sudo[229037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:09:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:32 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff71c002bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:32 compute-2 python3.9[229039]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 01 10:09:32 compute-2 systemd[1]: Reloading.
Dec 01 10:09:32 compute-2 systemd-sysv-generator[229069]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:09:32 compute-2 systemd-rc-local-generator[229066]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:09:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:32 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:33 compute-2 ceph-mon[76053]: pgmap v556: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:09:33 compute-2 sudo[229037]: pam_unix(sudo:session): session closed for user root
Dec 01 10:09:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:33.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:33 compute-2 sudo[229149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsquotigftdraxhckkhlglbfpurqyyki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583771.7330284-4155-254640005596444/AnsiballZ_systemd.py'
Dec 01 10:09:33 compute-2 sudo[229149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:09:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:33.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:33 compute-2 python3.9[229151]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 01 10:09:34 compute-2 systemd[1]: Reloading.
Dec 01 10:09:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:34 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:34 compute-2 systemd-rc-local-generator[229180]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 01 10:09:34 compute-2 systemd-sysv-generator[229184]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 01 10:09:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:34 compute-2 systemd[1]: Starting nova_compute container...
Dec 01 10:09:34 compute-2 ceph-mon[76053]: pgmap v557: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:09:34 compute-2 systemd[1]: Started libcrun container.
Dec 01 10:09:34 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf7a375255c4fa9c039cf2b43423abc310abfcb7eb9577398f4dadac2bb8b910/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 01 10:09:34 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf7a375255c4fa9c039cf2b43423abc310abfcb7eb9577398f4dadac2bb8b910/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 01 10:09:34 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf7a375255c4fa9c039cf2b43423abc310abfcb7eb9577398f4dadac2bb8b910/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 01 10:09:34 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf7a375255c4fa9c039cf2b43423abc310abfcb7eb9577398f4dadac2bb8b910/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 01 10:09:34 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf7a375255c4fa9c039cf2b43423abc310abfcb7eb9577398f4dadac2bb8b910/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 01 10:09:34 compute-2 podman[229191]: 2025-12-01 10:09:34.512801271 +0000 UTC m=+0.103734182 container init afb94358c6dc054416da57a6c90e644f1f51f777ba527b5ef49d4e3821af9682 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm)
Dec 01 10:09:34 compute-2 podman[229191]: 2025-12-01 10:09:34.519676947 +0000 UTC m=+0.110609838 container start afb94358c6dc054416da57a6c90e644f1f51f777ba527b5ef49d4e3821af9682 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 01 10:09:34 compute-2 podman[229191]: nova_compute
Dec 01 10:09:34 compute-2 nova_compute[229206]: + sudo -E kolla_set_configs
Dec 01 10:09:34 compute-2 systemd[1]: Started nova_compute container.
Dec 01 10:09:34 compute-2 sudo[229149]: pam_unix(sudo:session): session closed for user root
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Validating config file
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Copying service configuration files
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Deleting /etc/ceph
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Creating directory /etc/ceph
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Setting permission for /etc/ceph
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 01 10:09:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:34 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Writing out command to execute
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 01 10:09:34 compute-2 nova_compute[229206]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 01 10:09:34 compute-2 nova_compute[229206]: ++ cat /run_command
Dec 01 10:09:34 compute-2 nova_compute[229206]: + CMD=nova-compute
Dec 01 10:09:34 compute-2 nova_compute[229206]: + ARGS=
Dec 01 10:09:34 compute-2 nova_compute[229206]: + sudo kolla_copy_cacerts
Dec 01 10:09:34 compute-2 nova_compute[229206]: + [[ ! -n '' ]]
Dec 01 10:09:34 compute-2 nova_compute[229206]: + . kolla_extend_start
Dec 01 10:09:34 compute-2 nova_compute[229206]: Running command: 'nova-compute'
Dec 01 10:09:34 compute-2 nova_compute[229206]: + echo 'Running command: '\''nova-compute'\'''
Dec 01 10:09:34 compute-2 nova_compute[229206]: + umask 0022
Dec 01 10:09:34 compute-2 nova_compute[229206]: + exec nova-compute
Dec 01 10:09:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:34 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff71c002bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:09:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:35.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:09:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:35.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:35 compute-2 python3.9[229370]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 10:09:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:36 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:36 compute-2 ceph-mon[76053]: pgmap v558: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:09:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:36 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:36 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:09:36 compute-2 python3.9[229520]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 10:09:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:36 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:37 compute-2 nova_compute[229206]: 2025-12-01 10:09:37.056 229210 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 01 10:09:37 compute-2 nova_compute[229206]: 2025-12-01 10:09:37.056 229210 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 01 10:09:37 compute-2 nova_compute[229206]: 2025-12-01 10:09:37.057 229210 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 01 10:09:37 compute-2 nova_compute[229206]: 2025-12-01 10:09:37.057 229210 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Dec 01 10:09:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:37 compute-2 nova_compute[229206]: 2025-12-01 10:09:37.234 229210 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:09:37 compute-2 nova_compute[229206]: 2025-12-01 10:09:37.250 229210 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.016s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:09:37 compute-2 nova_compute[229206]: 2025-12-01 10:09:37.250 229210 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Dec 01 10:09:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:09:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:37.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:09:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:37.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:37 compute-2 nova_compute[229206]: 2025-12-01 10:09:37.760 229210 INFO nova.virt.driver [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 01 10:09:37 compute-2 nova_compute[229206]: 2025-12-01 10:09:37.910 229210 INFO nova.compute.provider_config [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 01 10:09:37 compute-2 python3.9[229676]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.069 229210 DEBUG oslo_concurrency.lockutils [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.070 229210 DEBUG oslo_concurrency.lockutils [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.070 229210 DEBUG oslo_concurrency.lockutils [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.071 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.071 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.071 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.071 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.071 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.071 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.071 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.072 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.072 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.072 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.072 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.072 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.072 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.072 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.073 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.073 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.073 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.074 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.074 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.074 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.074 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.075 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.075 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.075 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.075 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.076 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.076 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.076 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.076 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.076 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.077 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.077 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.077 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.077 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.077 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.077 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.078 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.078 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.078 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.078 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.078 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.079 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.079 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.079 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.079 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.079 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.080 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.080 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.080 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.080 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.080 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.081 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.081 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.081 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.081 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.081 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.082 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.082 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.082 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.082 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.082 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.082 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.083 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.083 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.083 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.083 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.083 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.084 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.084 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.084 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.084 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.084 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.084 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.085 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.085 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.085 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.085 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.085 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.086 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.086 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.086 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.086 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.086 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.087 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.087 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.087 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.087 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.087 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.087 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.088 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.088 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.088 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.088 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.088 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.088 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.089 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.089 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.089 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.089 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.089 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.090 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.090 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.090 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.090 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.090 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.091 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.091 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.091 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.091 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.091 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.092 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.092 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.092 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.092 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.092 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.092 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.093 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.093 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.093 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.093 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.093 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.094 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.094 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.094 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.094 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.094 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.094 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.095 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.095 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.095 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.095 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.095 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.096 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.096 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.096 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.096 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.096 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.096 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.097 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.097 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.097 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.097 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.097 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.098 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.098 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.098 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.098 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.098 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.099 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.099 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.099 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.099 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.099 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.099 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.100 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.100 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.100 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.100 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.100 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.101 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.101 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.101 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.101 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.101 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.102 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.102 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.102 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.102 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.102 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.103 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.103 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.103 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.103 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.103 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.103 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.104 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.104 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.104 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.104 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.104 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.105 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.105 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.105 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.105 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.105 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.106 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.106 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.106 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.106 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.106 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.107 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.107 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.107 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.107 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.107 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.107 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.108 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.108 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.108 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.108 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.108 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.109 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.109 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.109 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.109 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.109 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.109 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.110 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.110 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.110 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.110 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.110 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.111 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.111 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.111 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.111 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.111 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.111 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.112 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.112 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.112 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.112 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.112 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.113 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.113 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.113 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.113 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.113 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.113 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:38 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff71c002bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.114 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.114 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.114 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.114 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.114 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.115 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.115 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.115 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.115 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.115 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.115 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.115 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.116 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.116 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.116 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.116 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.116 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.117 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.117 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.117 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.117 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.117 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.117 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.118 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.118 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.118 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.118 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.118 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.119 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.119 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.119 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.119 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.119 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.119 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.120 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.120 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.120 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.120 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.120 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.121 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.121 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.121 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.121 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.121 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.122 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.122 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.122 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.122 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.122 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.123 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.123 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.123 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.123 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.123 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.123 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.124 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.124 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.124 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.124 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.124 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.124 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.125 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.125 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.125 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.125 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.125 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.126 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.126 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.126 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.126 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.126 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.126 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.127 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.127 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.127 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.127 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.127 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.127 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.128 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.128 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.128 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.128 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.128 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.128 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.129 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.129 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.129 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.129 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.129 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.129 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.130 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.130 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.130 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.130 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.130 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.130 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.131 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.131 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.131 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.131 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.131 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.131 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.132 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.132 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.132 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.132 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.132 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.133 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.133 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.133 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.133 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.133 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.134 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.134 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.134 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.134 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.134 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.135 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.135 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.135 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.135 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.135 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.136 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.136 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.136 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.136 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.137 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.137 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.137 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.137 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.137 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.138 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.138 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.138 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.138 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.138 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.139 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.139 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.139 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.139 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.139 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.140 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.140 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.140 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.140 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.140 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.141 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.141 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.141 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.141 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.141 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.142 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.142 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.142 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.142 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.142 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.142 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.143 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.143 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.143 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.143 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.144 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.144 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.144 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.144 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.144 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.144 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.145 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.145 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.145 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.145 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.145 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.145 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.145 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.146 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.146 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.146 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.146 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.146 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.147 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.147 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.147 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.147 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.147 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.147 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.148 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.148 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.148 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.148 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.148 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.148 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.149 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.149 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.149 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.149 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.149 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.149 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.150 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.150 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.150 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.150 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.150 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.151 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.151 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.151 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.151 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.151 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.151 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.152 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.152 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.152 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.152 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.152 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.153 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.153 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.153 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.153 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.153 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.153 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.153 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.154 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.154 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.154 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.154 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.154 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.154 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.154 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.155 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.155 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.155 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.155 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.155 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.155 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.156 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.156 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.156 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.156 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.156 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.156 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.156 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.157 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.157 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.157 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.157 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.157 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.157 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.157 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.158 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.158 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.158 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.158 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.158 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.158 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.159 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.159 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.159 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.159 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.159 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.159 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.159 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.160 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.160 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.160 229210 WARNING oslo_config.cfg [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 01 10:09:38 compute-2 nova_compute[229206]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 01 10:09:38 compute-2 nova_compute[229206]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 01 10:09:38 compute-2 nova_compute[229206]: and ``live_migration_inbound_addr`` respectively.
Dec 01 10:09:38 compute-2 nova_compute[229206]: ).  Its value may be silently ignored in the future.
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.160 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.160 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.160 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.161 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.161 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.161 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.161 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.161 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.161 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.162 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.162 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.162 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.162 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.162 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.162 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.163 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.163 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.163 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.163 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.rbd_secret_uuid        = 365f19c2-81e5-5edd-b6b4-280555214d3a log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.163 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.163 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.163 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.164 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.164 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.164 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.164 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.164 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.164 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.164 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.165 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.165 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.165 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.165 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.165 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.166 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.166 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.166 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.166 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.166 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.166 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.166 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.167 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.167 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.167 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.167 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.167 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.167 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.168 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.168 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.168 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.168 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.168 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.168 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.168 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.169 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.169 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.169 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.169 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.169 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.169 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.170 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.170 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.170 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.170 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.170 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.170 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.171 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.171 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.171 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.171 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.171 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.172 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.172 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.172 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.172 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.172 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.172 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.173 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.173 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.173 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.173 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.174 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.174 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.174 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.174 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.174 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.175 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.175 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.175 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.175 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.175 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.176 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.176 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.176 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.176 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.176 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.176 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.177 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.177 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.177 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.177 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.177 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.178 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.178 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.178 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.178 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.178 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.179 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.179 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.179 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.179 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.179 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.179 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.180 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.180 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.180 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.180 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.180 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.180 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.180 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.181 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.181 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.181 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.181 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.181 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.181 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.181 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.182 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.182 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.182 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.182 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.182 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.182 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.182 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.183 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.183 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.183 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.183 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.183 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.183 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.184 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.184 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.184 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.184 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.184 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.184 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.184 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.185 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.185 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.185 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.185 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.185 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.185 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.185 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.186 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.186 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.186 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.186 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.186 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.187 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.187 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.187 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.187 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.187 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.187 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.188 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.188 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.188 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.188 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.188 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.188 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.188 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.189 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.189 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.189 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.189 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.189 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.189 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.189 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.190 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.190 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.190 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.190 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.190 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.190 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.191 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.191 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.191 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.191 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.191 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.191 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.191 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.192 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.192 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.192 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.192 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.192 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.193 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.193 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.193 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.193 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.193 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.193 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.194 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.194 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.194 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.194 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.194 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.194 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.195 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.195 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.195 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.195 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.195 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.195 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.196 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.196 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.196 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.196 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.196 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.196 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.197 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.197 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.197 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.197 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.197 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.197 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.198 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.198 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.198 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.198 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.198 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.199 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.199 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.199 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.199 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.199 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.199 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.200 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.200 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.200 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.200 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.200 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.200 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.201 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.201 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.201 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.201 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.201 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.202 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.202 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.202 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.202 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.202 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.203 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.203 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.203 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.203 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.203 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.203 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.204 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.204 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.204 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.204 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.204 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.204 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.204 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.205 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.205 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.205 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.205 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.205 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.205 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.206 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.206 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.206 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.206 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.206 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.206 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.206 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.207 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.207 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.207 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.207 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.207 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.207 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.207 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.208 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.208 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.208 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.208 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.208 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.208 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.209 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.209 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.209 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.209 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.209 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.209 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.209 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.210 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.210 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.210 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.210 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.210 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.210 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.210 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.211 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.211 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.211 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.211 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.211 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.211 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.212 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.212 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.212 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.212 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.212 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.212 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.212 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.213 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.213 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.213 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.213 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.213 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.213 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.213 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.214 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.214 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.214 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.214 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.214 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.214 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.215 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.215 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.215 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.215 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.215 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.215 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.215 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.216 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.216 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.216 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.216 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.216 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.216 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.216 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.217 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.217 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.217 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.217 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.217 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.217 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.218 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.218 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.218 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.218 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.218 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.218 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.219 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.219 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.219 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.219 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.219 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.219 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.220 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.220 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.220 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.220 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.220 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.220 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.221 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.221 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.221 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.221 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.221 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.221 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.222 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.222 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.222 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.222 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.222 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.223 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.223 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.223 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.223 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.223 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.223 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.224 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.224 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.224 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.224 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.224 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.224 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.225 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.225 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.225 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.225 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.225 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.225 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.226 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.226 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.226 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.226 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.226 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.226 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.226 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.227 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.227 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.227 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.227 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.227 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.227 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.227 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.228 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.228 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.228 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.228 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.228 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.228 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.228 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.229 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.229 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.229 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.230 229210 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.258 229210 DEBUG nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.259 229210 DEBUG nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.259 229210 DEBUG nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.259 229210 DEBUG nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Dec 01 10:09:38 compute-2 systemd[1]: Starting libvirt QEMU daemon...
Dec 01 10:09:38 compute-2 systemd[1]: Started libvirt QEMU daemon.
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.335 229210 DEBUG nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f5e6bd2dca0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.339 229210 DEBUG nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f5e6bd2dca0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.340 229210 INFO nova.virt.libvirt.driver [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Connection event '1' reason 'None'
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.352 229210 WARNING nova.virt.libvirt.driver [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Cannot update service status on host "compute-2.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.
Dec 01 10:09:38 compute-2 nova_compute[229206]: 2025-12-01 10:09:38.353 229210 DEBUG nova.virt.libvirt.volume.mount [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 01 10:09:38 compute-2 ceph-mon[76053]: pgmap v559: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:09:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:38 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:38 compute-2 sudo[229878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpimeukiihwgxexuehoeirvucdnfyjvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583778.3708508-4336-31130681349020/AnsiballZ_podman_container.py'
Dec 01 10:09:38 compute-2 sudo[229878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:09:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:38 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:39 compute-2 python3.9[229880]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 01 10:09:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:39 compute-2 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 01 10:09:39 compute-2 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 01 10:09:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:39 compute-2 sudo[229878]: pam_unix(sudo:session): session closed for user root
Dec 01 10:09:39 compute-2 nova_compute[229206]: 2025-12-01 10:09:39.209 229210 INFO nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Libvirt host capabilities <capabilities>
Dec 01 10:09:39 compute-2 nova_compute[229206]: 
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <host>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <uuid>c016036b-c202-4470-908b-16395dc3b958</uuid>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <cpu>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <arch>x86_64</arch>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model>EPYC-Rome-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <vendor>AMD</vendor>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <microcode version='16777317'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <signature family='23' model='49' stepping='0'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <maxphysaddr mode='emulate' bits='40'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature name='x2apic'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature name='tsc-deadline'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature name='osxsave'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature name='hypervisor'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature name='tsc_adjust'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature name='spec-ctrl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature name='stibp'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature name='arch-capabilities'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature name='ssbd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature name='cmp_legacy'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature name='topoext'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature name='virt-ssbd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature name='lbrv'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature name='tsc-scale'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature name='vmcb-clean'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature name='pause-filter'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature name='pfthreshold'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature name='svme-addr-chk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature name='rdctl-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature name='skip-l1dfl-vmentry'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature name='mds-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature name='pschange-mc-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <pages unit='KiB' size='4'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <pages unit='KiB' size='2048'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <pages unit='KiB' size='1048576'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </cpu>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <power_management>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <suspend_mem/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </power_management>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <iommu support='no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <migration_features>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <live/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <uri_transports>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <uri_transport>tcp</uri_transport>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <uri_transport>rdma</uri_transport>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </uri_transports>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </migration_features>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <topology>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <cells num='1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <cell id='0'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:           <memory unit='KiB'>7864316</memory>
Dec 01 10:09:39 compute-2 nova_compute[229206]:           <pages unit='KiB' size='4'>1966079</pages>
Dec 01 10:09:39 compute-2 nova_compute[229206]:           <pages unit='KiB' size='2048'>0</pages>
Dec 01 10:09:39 compute-2 nova_compute[229206]:           <pages unit='KiB' size='1048576'>0</pages>
Dec 01 10:09:39 compute-2 nova_compute[229206]:           <distances>
Dec 01 10:09:39 compute-2 nova_compute[229206]:             <sibling id='0' value='10'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:           </distances>
Dec 01 10:09:39 compute-2 nova_compute[229206]:           <cpus num='8'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:           </cpus>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         </cell>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </cells>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </topology>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <cache>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </cache>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <secmodel>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model>selinux</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <doi>0</doi>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </secmodel>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <secmodel>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model>dac</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <doi>0</doi>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <baselabel type='kvm'>+107:+107</baselabel>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <baselabel type='qemu'>+107:+107</baselabel>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </secmodel>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   </host>
Dec 01 10:09:39 compute-2 nova_compute[229206]: 
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <guest>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <os_type>hvm</os_type>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <arch name='i686'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <wordsize>32</wordsize>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <domain type='qemu'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <domain type='kvm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </arch>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <features>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <pae/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <nonpae/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <acpi default='on' toggle='yes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <apic default='on' toggle='no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <cpuselection/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <deviceboot/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <disksnapshot default='on' toggle='no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <externalSnapshot/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </features>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   </guest>
Dec 01 10:09:39 compute-2 nova_compute[229206]: 
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <guest>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <os_type>hvm</os_type>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <arch name='x86_64'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <wordsize>64</wordsize>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <domain type='qemu'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <domain type='kvm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </arch>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <features>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <acpi default='on' toggle='yes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <apic default='on' toggle='no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <cpuselection/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <deviceboot/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <disksnapshot default='on' toggle='no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <externalSnapshot/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </features>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   </guest>
Dec 01 10:09:39 compute-2 nova_compute[229206]: 
Dec 01 10:09:39 compute-2 nova_compute[229206]: </capabilities>
Dec 01 10:09:39 compute-2 nova_compute[229206]: 
Dec 01 10:09:39 compute-2 nova_compute[229206]: 2025-12-01 10:09:39.216 229210 DEBUG nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 01 10:09:39 compute-2 nova_compute[229206]: 2025-12-01 10:09:39.240 229210 DEBUG nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 01 10:09:39 compute-2 nova_compute[229206]: <domainCapabilities>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <path>/usr/libexec/qemu-kvm</path>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <domain>kvm</domain>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <arch>i686</arch>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <vcpu max='240'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <iothreads supported='yes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <os supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <enum name='firmware'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <loader supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='type'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>rom</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>pflash</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='readonly'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>yes</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>no</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='secure'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>no</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </loader>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   </os>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <cpu>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <mode name='host-passthrough' supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='hostPassthroughMigratable'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>on</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>off</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </mode>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <mode name='maximum' supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='maximumMigratable'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>on</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>off</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </mode>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <mode name='host-model' supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <vendor>AMD</vendor>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='x2apic'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='tsc-deadline'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='hypervisor'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='tsc_adjust'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='spec-ctrl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='stibp'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='ssbd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='cmp_legacy'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='overflow-recov'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='succor'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='ibrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='amd-ssbd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='virt-ssbd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='lbrv'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='tsc-scale'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='vmcb-clean'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='flushbyasid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='pause-filter'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='pfthreshold'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='svme-addr-chk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='disable' name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </mode>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <mode name='custom' supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell-noTSX'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cascadelake-Server'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cascadelake-Server-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cascadelake-Server-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cascadelake-Server-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cascadelake-Server-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cascadelake-Server-v5'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cooperlake'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cooperlake-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cooperlake-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Denverton'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mpx'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Denverton-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mpx'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Denverton-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Denverton-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Dhyana-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Genoa'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amd-psfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='auto-ibrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='no-nested-data-bp'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='null-sel-clr-base'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='stibp-always-on'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Genoa-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amd-psfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='auto-ibrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='no-nested-data-bp'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='null-sel-clr-base'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='stibp-always-on'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Milan'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Milan-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Milan-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amd-psfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='no-nested-data-bp'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='null-sel-clr-base'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='stibp-always-on'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Rome'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Rome-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Rome-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Rome-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='GraniteRapids'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-tile'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fbsdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrc'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fzrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mcdt-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pbrsb-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='prefetchiti'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='psdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='GraniteRapids-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-tile'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fbsdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrc'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fzrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mcdt-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pbrsb-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='prefetchiti'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='psdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='GraniteRapids-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-tile'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx10'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx10-128'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx10-256'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx10-512'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cldemote'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fbsdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrc'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fzrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mcdt-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdir64b'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdiri'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pbrsb-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='prefetchiti'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='psdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell-noTSX'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-noTSX'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-v5'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-v6'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-v7'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='IvyBridge'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='IvyBridge-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='IvyBridge-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='IvyBridge-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='KnightsMill'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-4fmaps'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-4vnniw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512er'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512pf'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='KnightsMill-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-4fmaps'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-4vnniw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512er'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512pf'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Opteron_G4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fma4'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xop'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Opteron_G4-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fma4'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xop'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Opteron_G5'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fma4'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tbm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xop'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Opteron_G5-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fma4'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tbm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xop'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='SapphireRapids'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-tile'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrc'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fzrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='SapphireRapids-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-tile'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrc'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fzrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='SapphireRapids-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-tile'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fbsdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrc'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fzrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='psdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='SapphireRapids-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-tile'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cldemote'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fbsdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrc'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fzrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdir64b'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdiri'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='psdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='SierraForest'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-ne-convert'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cmpccxadd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fbsdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mcdt-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pbrsb-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='psdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='SierraForest-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-ne-convert'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cmpccxadd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fbsdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mcdt-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pbrsb-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='psdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Client'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Client-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Client-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Client-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Client-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Client-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server-v5'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Snowridge'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cldemote'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='core-capability'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdir64b'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdiri'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mpx'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='split-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Snowridge-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cldemote'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='core-capability'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdir64b'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdiri'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mpx'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='split-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Snowridge-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cldemote'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='core-capability'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdir64b'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdiri'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='split-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Snowridge-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cldemote'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='core-capability'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdir64b'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdiri'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='split-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Snowridge-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cldemote'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdir64b'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdiri'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='athlon'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnow'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnowext'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='athlon-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnow'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnowext'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='core2duo'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='core2duo-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='coreduo'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='coreduo-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='n270'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='n270-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='phenom'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnow'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnowext'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='phenom-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnow'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnowext'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </mode>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   </cpu>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <memoryBacking supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <enum name='sourceType'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <value>file</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <value>anonymous</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <value>memfd</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   </memoryBacking>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <devices>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <disk supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='diskDevice'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>disk</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>cdrom</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>floppy</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>lun</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='bus'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>ide</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>fdc</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>scsi</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>usb</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>sata</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='model'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio-transitional</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio-non-transitional</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </disk>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <graphics supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='type'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>vnc</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>egl-headless</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>dbus</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </graphics>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <video supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='modelType'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>vga</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>cirrus</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>none</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>bochs</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>ramfb</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </video>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <hostdev supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='mode'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>subsystem</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='startupPolicy'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>default</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>mandatory</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>requisite</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>optional</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='subsysType'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>usb</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>pci</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>scsi</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='capsType'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='pciBackend'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </hostdev>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <rng supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='model'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio-transitional</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio-non-transitional</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='backendModel'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>random</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>egd</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>builtin</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </rng>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <filesystem supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='driverType'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>path</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>handle</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtiofs</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </filesystem>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <tpm supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='model'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>tpm-tis</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>tpm-crb</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='backendModel'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>emulator</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>external</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='backendVersion'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>2.0</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </tpm>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <redirdev supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='bus'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>usb</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </redirdev>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <channel supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='type'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>pty</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>unix</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </channel>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <crypto supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='model'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='type'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>qemu</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='backendModel'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>builtin</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </crypto>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <interface supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='backendType'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>default</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>passt</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </interface>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <panic supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='model'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>isa</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>hyperv</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </panic>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <console supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='type'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>null</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>vc</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>pty</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>dev</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>file</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>pipe</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>stdio</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>udp</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>tcp</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>unix</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>qemu-vdagent</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>dbus</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </console>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   </devices>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <features>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <gic supported='no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <vmcoreinfo supported='yes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <genid supported='yes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <backingStoreInput supported='yes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <backup supported='yes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <async-teardown supported='yes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <ps2 supported='yes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <sev supported='no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <sgx supported='no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <hyperv supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='features'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>relaxed</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>vapic</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>spinlocks</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>vpindex</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>runtime</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>synic</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>stimer</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>reset</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>vendor_id</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>frequencies</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>reenlightenment</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>tlbflush</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>ipi</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>avic</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>emsr_bitmap</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>xmm_input</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <defaults>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <spinlocks>4095</spinlocks>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <stimer_direct>on</stimer_direct>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <tlbflush_direct>on</tlbflush_direct>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <tlbflush_extended>on</tlbflush_extended>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </defaults>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </hyperv>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <launchSecurity supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='sectype'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>tdx</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </launchSecurity>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   </features>
Dec 01 10:09:39 compute-2 nova_compute[229206]: </domainCapabilities>
Dec 01 10:09:39 compute-2 nova_compute[229206]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 01 10:09:39 compute-2 nova_compute[229206]: 2025-12-01 10:09:39.247 229210 DEBUG nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 01 10:09:39 compute-2 nova_compute[229206]: <domainCapabilities>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <path>/usr/libexec/qemu-kvm</path>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <domain>kvm</domain>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <arch>i686</arch>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <vcpu max='4096'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <iothreads supported='yes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <os supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <enum name='firmware'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <loader supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='type'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>rom</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>pflash</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='readonly'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>yes</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>no</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='secure'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>no</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </loader>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   </os>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <cpu>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <mode name='host-passthrough' supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='hostPassthroughMigratable'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>on</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>off</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </mode>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <mode name='maximum' supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='maximumMigratable'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>on</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>off</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </mode>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <mode name='host-model' supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <vendor>AMD</vendor>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='x2apic'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='tsc-deadline'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='hypervisor'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='tsc_adjust'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='spec-ctrl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='stibp'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='ssbd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='cmp_legacy'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='overflow-recov'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='succor'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='ibrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='amd-ssbd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='virt-ssbd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='lbrv'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='tsc-scale'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='vmcb-clean'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='flushbyasid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='pause-filter'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='pfthreshold'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='svme-addr-chk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='disable' name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </mode>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <mode name='custom' supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell-noTSX'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cascadelake-Server'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cascadelake-Server-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cascadelake-Server-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cascadelake-Server-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cascadelake-Server-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cascadelake-Server-v5'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cooperlake'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cooperlake-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cooperlake-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Denverton'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mpx'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Denverton-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mpx'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Denverton-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Denverton-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Dhyana-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Genoa'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amd-psfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='auto-ibrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='no-nested-data-bp'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='null-sel-clr-base'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='stibp-always-on'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Genoa-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amd-psfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='auto-ibrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='no-nested-data-bp'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='null-sel-clr-base'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='stibp-always-on'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Milan'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Milan-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Milan-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amd-psfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='no-nested-data-bp'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='null-sel-clr-base'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='stibp-always-on'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Rome'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Rome-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Rome-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Rome-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='GraniteRapids'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-tile'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fbsdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrc'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fzrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mcdt-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pbrsb-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='prefetchiti'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='psdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='GraniteRapids-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-tile'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fbsdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrc'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fzrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mcdt-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pbrsb-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='prefetchiti'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='psdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='GraniteRapids-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-tile'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx10'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx10-128'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx10-256'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx10-512'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cldemote'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fbsdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrc'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fzrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mcdt-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdir64b'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdiri'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pbrsb-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='prefetchiti'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='psdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell-noTSX'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-noTSX'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-v5'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-v6'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-v7'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='IvyBridge'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='IvyBridge-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='IvyBridge-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='IvyBridge-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='KnightsMill'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-4fmaps'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-4vnniw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512er'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512pf'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='KnightsMill-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-4fmaps'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-4vnniw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512er'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512pf'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Opteron_G4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fma4'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xop'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Opteron_G4-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fma4'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xop'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Opteron_G5'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fma4'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tbm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xop'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Opteron_G5-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fma4'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tbm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xop'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='SapphireRapids'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-tile'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrc'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fzrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='SapphireRapids-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-tile'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrc'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fzrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='SapphireRapids-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-tile'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fbsdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrc'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fzrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='psdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='SapphireRapids-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-tile'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cldemote'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fbsdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrc'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fzrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdir64b'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdiri'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='psdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='SierraForest'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-ne-convert'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cmpccxadd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fbsdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mcdt-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pbrsb-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='psdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='SierraForest-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-ne-convert'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cmpccxadd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fbsdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mcdt-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pbrsb-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='psdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Client'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Client-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Client-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Client-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Client-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Client-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server-v5'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Snowridge'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cldemote'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='core-capability'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdir64b'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdiri'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mpx'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='split-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Snowridge-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cldemote'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='core-capability'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdir64b'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdiri'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mpx'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='split-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Snowridge-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cldemote'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='core-capability'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdir64b'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdiri'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='split-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Snowridge-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cldemote'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='core-capability'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdir64b'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdiri'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='split-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Snowridge-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cldemote'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdir64b'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdiri'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='athlon'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnow'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnowext'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='athlon-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnow'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnowext'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='core2duo'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='core2duo-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='coreduo'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='coreduo-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='n270'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='n270-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='phenom'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnow'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnowext'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='phenom-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnow'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnowext'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </mode>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   </cpu>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <memoryBacking supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <enum name='sourceType'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <value>file</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <value>anonymous</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <value>memfd</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   </memoryBacking>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <devices>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <disk supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='diskDevice'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>disk</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>cdrom</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>floppy</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>lun</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='bus'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>fdc</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>scsi</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>usb</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>sata</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='model'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio-transitional</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio-non-transitional</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </disk>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <graphics supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='type'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>vnc</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>egl-headless</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>dbus</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </graphics>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <video supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='modelType'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>vga</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>cirrus</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>none</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>bochs</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>ramfb</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </video>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <hostdev supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='mode'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>subsystem</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='startupPolicy'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>default</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>mandatory</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>requisite</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>optional</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='subsysType'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>usb</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>pci</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>scsi</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='capsType'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='pciBackend'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </hostdev>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <rng supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='model'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio-transitional</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio-non-transitional</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='backendModel'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>random</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>egd</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>builtin</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </rng>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <filesystem supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='driverType'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>path</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>handle</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtiofs</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </filesystem>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <tpm supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='model'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>tpm-tis</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>tpm-crb</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='backendModel'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>emulator</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>external</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='backendVersion'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>2.0</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </tpm>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <redirdev supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='bus'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>usb</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </redirdev>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <channel supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='type'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>pty</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>unix</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </channel>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <crypto supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='model'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='type'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>qemu</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='backendModel'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>builtin</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </crypto>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <interface supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='backendType'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>default</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>passt</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </interface>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <panic supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='model'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>isa</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>hyperv</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </panic>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <console supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='type'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>null</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>vc</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>pty</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>dev</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>file</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>pipe</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>stdio</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>udp</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>tcp</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>unix</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>qemu-vdagent</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>dbus</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </console>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   </devices>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <features>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <gic supported='no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <vmcoreinfo supported='yes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <genid supported='yes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <backingStoreInput supported='yes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <backup supported='yes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <async-teardown supported='yes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <ps2 supported='yes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <sev supported='no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <sgx supported='no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <hyperv supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='features'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>relaxed</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>vapic</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>spinlocks</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>vpindex</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>runtime</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>synic</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>stimer</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>reset</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>vendor_id</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>frequencies</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>reenlightenment</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>tlbflush</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>ipi</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>avic</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>emsr_bitmap</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>xmm_input</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <defaults>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <spinlocks>4095</spinlocks>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <stimer_direct>on</stimer_direct>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <tlbflush_direct>on</tlbflush_direct>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <tlbflush_extended>on</tlbflush_extended>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </defaults>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </hyperv>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <launchSecurity supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='sectype'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>tdx</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </launchSecurity>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   </features>
Dec 01 10:09:39 compute-2 nova_compute[229206]: </domainCapabilities>
Dec 01 10:09:39 compute-2 nova_compute[229206]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 01 10:09:39 compute-2 nova_compute[229206]: 2025-12-01 10:09:39.276 229210 DEBUG nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 01 10:09:39 compute-2 nova_compute[229206]: 2025-12-01 10:09:39.281 229210 DEBUG nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 01 10:09:39 compute-2 nova_compute[229206]: <domainCapabilities>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <path>/usr/libexec/qemu-kvm</path>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <domain>kvm</domain>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <arch>x86_64</arch>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <vcpu max='240'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <iothreads supported='yes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <os supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <enum name='firmware'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <loader supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='type'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>rom</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>pflash</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='readonly'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>yes</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>no</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='secure'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>no</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </loader>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   </os>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <cpu>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <mode name='host-passthrough' supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='hostPassthroughMigratable'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>on</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>off</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </mode>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <mode name='maximum' supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='maximumMigratable'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>on</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>off</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </mode>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <mode name='host-model' supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <vendor>AMD</vendor>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='x2apic'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='tsc-deadline'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='hypervisor'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='tsc_adjust'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='spec-ctrl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='stibp'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='ssbd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='cmp_legacy'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='overflow-recov'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='succor'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='ibrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='amd-ssbd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='virt-ssbd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='lbrv'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='tsc-scale'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='vmcb-clean'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='flushbyasid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='pause-filter'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='pfthreshold'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='svme-addr-chk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='disable' name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </mode>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <mode name='custom' supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell-noTSX'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cascadelake-Server'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cascadelake-Server-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cascadelake-Server-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cascadelake-Server-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cascadelake-Server-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cascadelake-Server-v5'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cooperlake'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cooperlake-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cooperlake-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Denverton'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mpx'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Denverton-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mpx'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Denverton-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Denverton-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Dhyana-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Genoa'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amd-psfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='auto-ibrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='no-nested-data-bp'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='null-sel-clr-base'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='stibp-always-on'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Genoa-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amd-psfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='auto-ibrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='no-nested-data-bp'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='null-sel-clr-base'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='stibp-always-on'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Milan'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Milan-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Milan-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amd-psfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='no-nested-data-bp'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='null-sel-clr-base'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='stibp-always-on'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Rome'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Rome-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Rome-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Rome-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='GraniteRapids'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-tile'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fbsdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrc'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fzrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mcdt-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pbrsb-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='prefetchiti'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='psdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='GraniteRapids-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-tile'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fbsdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrc'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fzrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mcdt-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pbrsb-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='prefetchiti'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='psdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='GraniteRapids-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-tile'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx10'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx10-128'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx10-256'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx10-512'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cldemote'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fbsdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrc'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fzrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mcdt-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdir64b'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdiri'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pbrsb-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='prefetchiti'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='psdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell-noTSX'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-noTSX'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-v5'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-v6'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-v7'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='IvyBridge'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='IvyBridge-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='IvyBridge-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='IvyBridge-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='KnightsMill'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-4fmaps'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-4vnniw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512er'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512pf'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='KnightsMill-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-4fmaps'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-4vnniw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512er'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512pf'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Opteron_G4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fma4'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xop'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Opteron_G4-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fma4'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xop'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Opteron_G5'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fma4'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tbm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xop'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Opteron_G5-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fma4'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tbm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xop'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='SapphireRapids'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-tile'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrc'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fzrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='SapphireRapids-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-tile'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrc'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fzrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='SapphireRapids-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-tile'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fbsdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrc'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fzrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='psdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='SapphireRapids-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-tile'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cldemote'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fbsdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrc'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fzrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdir64b'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdiri'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='psdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='SierraForest'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-ne-convert'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cmpccxadd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fbsdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mcdt-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pbrsb-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='psdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='SierraForest-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-ne-convert'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cmpccxadd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fbsdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mcdt-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pbrsb-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='psdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Client'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Client-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Client-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Client-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Client-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Client-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server-v5'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Snowridge'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cldemote'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='core-capability'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdir64b'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdiri'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mpx'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='split-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Snowridge-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cldemote'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='core-capability'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdir64b'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdiri'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mpx'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='split-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Snowridge-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cldemote'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='core-capability'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdir64b'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdiri'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='split-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Snowridge-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cldemote'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='core-capability'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdir64b'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdiri'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='split-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Snowridge-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cldemote'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdir64b'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdiri'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='athlon'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnow'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnowext'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='athlon-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnow'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnowext'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='core2duo'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='core2duo-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='coreduo'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='coreduo-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='n270'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='n270-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='phenom'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnow'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnowext'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='phenom-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnow'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnowext'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </mode>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   </cpu>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <memoryBacking supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <enum name='sourceType'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <value>file</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <value>anonymous</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <value>memfd</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   </memoryBacking>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <devices>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <disk supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='diskDevice'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>disk</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>cdrom</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>floppy</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>lun</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='bus'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>ide</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>fdc</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>scsi</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>usb</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>sata</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='model'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio-transitional</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio-non-transitional</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </disk>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <graphics supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='type'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>vnc</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>egl-headless</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>dbus</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </graphics>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <video supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='modelType'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>vga</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>cirrus</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>none</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>bochs</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>ramfb</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </video>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <hostdev supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='mode'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>subsystem</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='startupPolicy'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>default</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>mandatory</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>requisite</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>optional</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='subsysType'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>usb</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>pci</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>scsi</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='capsType'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='pciBackend'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </hostdev>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <rng supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='model'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio-transitional</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio-non-transitional</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='backendModel'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>random</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>egd</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>builtin</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </rng>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <filesystem supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='driverType'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>path</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>handle</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtiofs</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </filesystem>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <tpm supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='model'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>tpm-tis</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>tpm-crb</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='backendModel'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>emulator</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>external</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='backendVersion'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>2.0</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </tpm>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <redirdev supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='bus'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>usb</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </redirdev>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <channel supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='type'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>pty</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>unix</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </channel>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <crypto supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='model'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='type'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>qemu</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='backendModel'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>builtin</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </crypto>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <interface supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='backendType'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>default</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>passt</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </interface>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <panic supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='model'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>isa</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>hyperv</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </panic>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <console supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='type'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>null</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>vc</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>pty</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>dev</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>file</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>pipe</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>stdio</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>udp</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>tcp</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>unix</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>qemu-vdagent</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>dbus</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </console>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   </devices>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <features>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <gic supported='no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <vmcoreinfo supported='yes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <genid supported='yes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <backingStoreInput supported='yes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <backup supported='yes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <async-teardown supported='yes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <ps2 supported='yes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <sev supported='no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <sgx supported='no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <hyperv supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='features'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>relaxed</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>vapic</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>spinlocks</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>vpindex</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>runtime</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>synic</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>stimer</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>reset</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>vendor_id</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>frequencies</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>reenlightenment</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>tlbflush</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>ipi</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>avic</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>emsr_bitmap</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>xmm_input</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <defaults>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <spinlocks>4095</spinlocks>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <stimer_direct>on</stimer_direct>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <tlbflush_direct>on</tlbflush_direct>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <tlbflush_extended>on</tlbflush_extended>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </defaults>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </hyperv>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <launchSecurity supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='sectype'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>tdx</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </launchSecurity>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   </features>
Dec 01 10:09:39 compute-2 nova_compute[229206]: </domainCapabilities>
Dec 01 10:09:39 compute-2 nova_compute[229206]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 01 10:09:39 compute-2 nova_compute[229206]: 2025-12-01 10:09:39.340 229210 DEBUG nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 01 10:09:39 compute-2 nova_compute[229206]: <domainCapabilities>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <path>/usr/libexec/qemu-kvm</path>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <domain>kvm</domain>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <arch>x86_64</arch>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <vcpu max='4096'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <iothreads supported='yes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <os supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <enum name='firmware'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <value>efi</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <loader supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='type'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>rom</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>pflash</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='readonly'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>yes</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>no</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='secure'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>yes</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>no</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </loader>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   </os>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <cpu>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <mode name='host-passthrough' supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='hostPassthroughMigratable'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>on</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>off</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </mode>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <mode name='maximum' supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='maximumMigratable'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>on</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>off</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </mode>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <mode name='host-model' supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <vendor>AMD</vendor>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='x2apic'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='tsc-deadline'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='hypervisor'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='tsc_adjust'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='spec-ctrl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='stibp'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='ssbd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='cmp_legacy'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='overflow-recov'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='succor'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='ibrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='amd-ssbd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='virt-ssbd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='lbrv'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='tsc-scale'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='vmcb-clean'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='flushbyasid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='pause-filter'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='pfthreshold'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='svme-addr-chk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <feature policy='disable' name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </mode>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <mode name='custom' supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell-noTSX'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Broadwell-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cascadelake-Server'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cascadelake-Server-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cascadelake-Server-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cascadelake-Server-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cascadelake-Server-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cascadelake-Server-v5'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cooperlake'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cooperlake-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Cooperlake-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Denverton'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mpx'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Denverton-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mpx'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Denverton-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Denverton-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Dhyana-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Genoa'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amd-psfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='auto-ibrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='no-nested-data-bp'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='null-sel-clr-base'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='stibp-always-on'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Genoa-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amd-psfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='auto-ibrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='no-nested-data-bp'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='null-sel-clr-base'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='stibp-always-on'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Milan'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Milan-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Milan-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amd-psfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='no-nested-data-bp'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='null-sel-clr-base'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='stibp-always-on'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Rome'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Rome-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Rome-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-Rome-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='EPYC-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='GraniteRapids'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-tile'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fbsdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrc'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fzrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mcdt-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pbrsb-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='prefetchiti'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='psdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='GraniteRapids-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-tile'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fbsdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrc'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fzrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mcdt-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pbrsb-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='prefetchiti'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='psdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='GraniteRapids-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-tile'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx10'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx10-128'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx10-256'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx10-512'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cldemote'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fbsdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrc'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fzrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mcdt-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdir64b'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdiri'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pbrsb-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='prefetchiti'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='psdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell-noTSX'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Haswell-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-noTSX'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-v5'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-v6'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Icelake-Server-v7'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='IvyBridge'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='IvyBridge-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='IvyBridge-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='IvyBridge-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='KnightsMill'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-4fmaps'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-4vnniw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512er'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512pf'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='KnightsMill-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-4fmaps'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-4vnniw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512er'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512pf'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Opteron_G4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fma4'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xop'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Opteron_G4-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fma4'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xop'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Opteron_G5'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fma4'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tbm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xop'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Opteron_G5-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fma4'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tbm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xop'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='SapphireRapids'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-tile'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrc'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fzrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='SapphireRapids-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-tile'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrc'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fzrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='SapphireRapids-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-tile'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fbsdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrc'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fzrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='psdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='SapphireRapids-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='amx-tile'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-bf16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-fp16'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bitalg'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cldemote'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fbsdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrc'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fzrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='la57'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdir64b'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdiri'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='psdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='taa-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xfd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='SierraForest'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-ne-convert'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cmpccxadd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fbsdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mcdt-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pbrsb-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='psdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='SierraForest-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-ifma'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-ne-convert'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx-vnni-int8'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cmpccxadd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fbsdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='fsrs'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ibrs-all'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mcdt-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pbrsb-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='psdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='serialize'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vaes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Client'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Client-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Client-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Client-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Client-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Client-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='hle'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='rtm'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Skylake-Server-v5'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512bw'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512cd'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512dq'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512f'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='avx512vl'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='invpcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pcid'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='pku'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Snowridge'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cldemote'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='core-capability'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdir64b'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdiri'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mpx'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='split-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Snowridge-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cldemote'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='core-capability'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdir64b'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdiri'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='mpx'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='split-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Snowridge-v2'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cldemote'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='core-capability'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdir64b'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdiri'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='split-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Snowridge-v3'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cldemote'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='core-capability'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdir64b'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdiri'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='split-lock-detect'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='Snowridge-v4'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='cldemote'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='erms'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='gfni'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdir64b'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='movdiri'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='xsaves'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='athlon'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnow'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnowext'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='athlon-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnow'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnowext'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='core2duo'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='core2duo-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='coreduo'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='coreduo-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='n270'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='n270-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='ss'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='phenom'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnow'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnowext'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <blockers model='phenom-v1'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnow'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <feature name='3dnowext'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </blockers>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </mode>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   </cpu>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <memoryBacking supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <enum name='sourceType'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <value>file</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <value>anonymous</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <value>memfd</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   </memoryBacking>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <devices>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <disk supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='diskDevice'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>disk</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>cdrom</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>floppy</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>lun</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='bus'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>fdc</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>scsi</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>usb</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>sata</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='model'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio-transitional</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio-non-transitional</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </disk>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <graphics supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='type'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>vnc</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>egl-headless</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>dbus</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </graphics>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <video supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='modelType'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>vga</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>cirrus</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>none</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>bochs</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>ramfb</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </video>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <hostdev supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='mode'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>subsystem</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='startupPolicy'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>default</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>mandatory</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>requisite</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>optional</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='subsysType'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>usb</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>pci</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>scsi</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='capsType'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='pciBackend'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </hostdev>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <rng supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='model'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio-transitional</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtio-non-transitional</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='backendModel'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>random</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>egd</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>builtin</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </rng>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <filesystem supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='driverType'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>path</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>handle</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>virtiofs</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </filesystem>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <tpm supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='model'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>tpm-tis</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>tpm-crb</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='backendModel'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>emulator</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>external</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='backendVersion'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>2.0</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </tpm>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <redirdev supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='bus'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>usb</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </redirdev>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <channel supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='type'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>pty</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>unix</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </channel>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <crypto supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='model'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='type'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>qemu</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='backendModel'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>builtin</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </crypto>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <interface supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='backendType'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>default</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>passt</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </interface>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <panic supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='model'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>isa</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>hyperv</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </panic>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <console supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='type'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>null</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>vc</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>pty</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>dev</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>file</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>pipe</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>stdio</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>udp</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>tcp</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>unix</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>qemu-vdagent</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>dbus</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </console>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   </devices>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   <features>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <gic supported='no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <vmcoreinfo supported='yes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <genid supported='yes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <backingStoreInput supported='yes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <backup supported='yes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <async-teardown supported='yes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <ps2 supported='yes'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <sev supported='no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <sgx supported='no'/>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <hyperv supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='features'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>relaxed</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>vapic</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>spinlocks</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>vpindex</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>runtime</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>synic</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>stimer</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>reset</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>vendor_id</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>frequencies</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>reenlightenment</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>tlbflush</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>ipi</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>avic</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>emsr_bitmap</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>xmm_input</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <defaults>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <spinlocks>4095</spinlocks>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <stimer_direct>on</stimer_direct>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <tlbflush_direct>on</tlbflush_direct>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <tlbflush_extended>on</tlbflush_extended>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </defaults>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </hyperv>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     <launchSecurity supported='yes'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       <enum name='sectype'>
Dec 01 10:09:39 compute-2 nova_compute[229206]:         <value>tdx</value>
Dec 01 10:09:39 compute-2 nova_compute[229206]:       </enum>
Dec 01 10:09:39 compute-2 nova_compute[229206]:     </launchSecurity>
Dec 01 10:09:39 compute-2 nova_compute[229206]:   </features>
Dec 01 10:09:39 compute-2 nova_compute[229206]: </domainCapabilities>
Dec 01 10:09:39 compute-2 nova_compute[229206]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 01 10:09:39 compute-2 nova_compute[229206]: 2025-12-01 10:09:39.429 229210 DEBUG nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 01 10:09:39 compute-2 nova_compute[229206]: 2025-12-01 10:09:39.430 229210 DEBUG nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 01 10:09:39 compute-2 nova_compute[229206]: 2025-12-01 10:09:39.430 229210 DEBUG nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 01 10:09:39 compute-2 nova_compute[229206]: 2025-12-01 10:09:39.430 229210 INFO nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Secure Boot support detected
Dec 01 10:09:39 compute-2 nova_compute[229206]: 2025-12-01 10:09:39.433 229210 INFO nova.virt.libvirt.driver [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 01 10:09:39 compute-2 nova_compute[229206]: 2025-12-01 10:09:39.433 229210 INFO nova.virt.libvirt.driver [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 01 10:09:39 compute-2 nova_compute[229206]: 2025-12-01 10:09:39.444 229210 DEBUG nova.virt.libvirt.driver [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Dec 01 10:09:39 compute-2 nova_compute[229206]: 2025-12-01 10:09:39.469 229210 INFO nova.virt.node [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Determined node identity 801130cb-2e08-4a6f-b53c-1300fad37b0c from /var/lib/nova/compute_id
Dec 01 10:09:39 compute-2 nova_compute[229206]: 2025-12-01 10:09:39.488 229210 WARNING nova.compute.manager [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Compute nodes ['801130cb-2e08-4a6f-b53c-1300fad37b0c'] for host compute-2.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Dec 01 10:09:39 compute-2 nova_compute[229206]: 2025-12-01 10:09:39.516 229210 INFO nova.compute.manager [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Dec 01 10:09:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:39.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:39 compute-2 nova_compute[229206]: 2025-12-01 10:09:39.552 229210 WARNING nova.compute.manager [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] No compute node record found for host compute-2.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.
Dec 01 10:09:39 compute-2 nova_compute[229206]: 2025-12-01 10:09:39.552 229210 DEBUG oslo_concurrency.lockutils [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:09:39 compute-2 nova_compute[229206]: 2025-12-01 10:09:39.553 229210 DEBUG oslo_concurrency.lockutils [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:09:39 compute-2 nova_compute[229206]: 2025-12-01 10:09:39.553 229210 DEBUG oslo_concurrency.lockutils [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:09:39 compute-2 nova_compute[229206]: 2025-12-01 10:09:39.553 229210 DEBUG nova.compute.resource_tracker [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 10:09:39 compute-2 nova_compute[229206]: 2025-12-01 10:09:39.553 229210 DEBUG oslo_concurrency.processutils [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:09:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:39.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:39 compute-2 sudo[230083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-civigqiybxfbabfuizxmknpnknlgysjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583779.5590777-4360-32293540504736/AnsiballZ_systemd.py'
Dec 01 10:09:39 compute-2 sudo[230083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:09:40 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:09:40 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/217717765' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:09:40 compute-2 nova_compute[229206]: 2025-12-01 10:09:40.050 229210 DEBUG oslo_concurrency.processutils [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:09:40 compute-2 systemd[1]: Starting libvirt nodedev daemon...
Dec 01 10:09:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:40 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:40 compute-2 systemd[1]: Started libvirt nodedev daemon.
Dec 01 10:09:40 compute-2 python3.9[230085]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 01 10:09:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:40 compute-2 systemd[1]: Stopping nova_compute container...
Dec 01 10:09:40 compute-2 ceph-mon[76053]: pgmap v560: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:09:40 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3776919133' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:09:40 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/217717765' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:09:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:09:40 compute-2 podman[230088]: 2025-12-01 10:09:40.214853223 +0000 UTC m=+0.103218578 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 01 10:09:40 compute-2 nova_compute[229206]: 2025-12-01 10:09:40.247 229210 DEBUG oslo_concurrency.lockutils [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 01 10:09:40 compute-2 nova_compute[229206]: 2025-12-01 10:09:40.248 229210 DEBUG oslo_concurrency.lockutils [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 01 10:09:40 compute-2 nova_compute[229206]: 2025-12-01 10:09:40.248 229210 DEBUG oslo_concurrency.lockutils [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 01 10:09:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:40 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff71c002bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:40 compute-2 virtqemud[229722]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Dec 01 10:09:40 compute-2 virtqemud[229722]: hostname: compute-2
Dec 01 10:09:40 compute-2 virtqemud[229722]: End of file while reading data: Input/output error
Dec 01 10:09:40 compute-2 systemd[1]: libpod-afb94358c6dc054416da57a6c90e644f1f51f777ba527b5ef49d4e3821af9682.scope: Deactivated successfully.
Dec 01 10:09:40 compute-2 systemd[1]: libpod-afb94358c6dc054416da57a6c90e644f1f51f777ba527b5ef49d4e3821af9682.scope: Consumed 4.127s CPU time.
Dec 01 10:09:40 compute-2 podman[230129]: 2025-12-01 10:09:40.865315646 +0000 UTC m=+0.664583967 container died afb94358c6dc054416da57a6c90e644f1f51f777ba527b5ef49d4e3821af9682 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=nova_compute, managed_by=edpm_ansible)
Dec 01 10:09:40 compute-2 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-afb94358c6dc054416da57a6c90e644f1f51f777ba527b5ef49d4e3821af9682-userdata-shm.mount: Deactivated successfully.
Dec 01 10:09:40 compute-2 systemd[1]: var-lib-containers-storage-overlay-cf7a375255c4fa9c039cf2b43423abc310abfcb7eb9577398f4dadac2bb8b910-merged.mount: Deactivated successfully.
Dec 01 10:09:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:40 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:41 compute-2 sudo[230163]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:09:41 compute-2 sudo[230163]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:09:41 compute-2 sudo[230163]: pam_unix(sudo:session): session closed for user root
Dec 01 10:09:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:09:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:41.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:09:41 compute-2 podman[230129]: 2025-12-01 10:09:41.557288698 +0000 UTC m=+1.356556989 container cleanup afb94358c6dc054416da57a6c90e644f1f51f777ba527b5ef49d4e3821af9682 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 01 10:09:41 compute-2 podman[230129]: nova_compute
Dec 01 10:09:41 compute-2 podman[230188]: nova_compute
Dec 01 10:09:41 compute-2 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Dec 01 10:09:41 compute-2 systemd[1]: Stopped nova_compute container.
Dec 01 10:09:41 compute-2 systemd[1]: Starting nova_compute container...
Dec 01 10:09:41 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:09:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:09:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:41.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:09:41 compute-2 systemd[1]: Started libcrun container.
Dec 01 10:09:41 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf7a375255c4fa9c039cf2b43423abc310abfcb7eb9577398f4dadac2bb8b910/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 01 10:09:41 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf7a375255c4fa9c039cf2b43423abc310abfcb7eb9577398f4dadac2bb8b910/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 01 10:09:41 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf7a375255c4fa9c039cf2b43423abc310abfcb7eb9577398f4dadac2bb8b910/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 01 10:09:41 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf7a375255c4fa9c039cf2b43423abc310abfcb7eb9577398f4dadac2bb8b910/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 01 10:09:41 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf7a375255c4fa9c039cf2b43423abc310abfcb7eb9577398f4dadac2bb8b910/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 01 10:09:41 compute-2 podman[230201]: 2025-12-01 10:09:41.739159829 +0000 UTC m=+0.091420304 container init afb94358c6dc054416da57a6c90e644f1f51f777ba527b5ef49d4e3821af9682 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 01 10:09:41 compute-2 podman[230201]: 2025-12-01 10:09:41.749171647 +0000 UTC m=+0.101432092 container start afb94358c6dc054416da57a6c90e644f1f51f777ba527b5ef49d4e3821af9682 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Dec 01 10:09:41 compute-2 podman[230201]: nova_compute
Dec 01 10:09:41 compute-2 nova_compute[230216]: + sudo -E kolla_set_configs
Dec 01 10:09:41 compute-2 systemd[1]: Started nova_compute container.
Dec 01 10:09:41 compute-2 sudo[230083]: pam_unix(sudo:session): session closed for user root
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Validating config file
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Copying service configuration files
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Deleting /etc/ceph
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Creating directory /etc/ceph
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Setting permission for /etc/ceph
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Writing out command to execute
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 01 10:09:41 compute-2 nova_compute[230216]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 01 10:09:41 compute-2 nova_compute[230216]: ++ cat /run_command
Dec 01 10:09:41 compute-2 nova_compute[230216]: + CMD=nova-compute
Dec 01 10:09:41 compute-2 nova_compute[230216]: + ARGS=
Dec 01 10:09:41 compute-2 nova_compute[230216]: + sudo kolla_copy_cacerts
Dec 01 10:09:41 compute-2 nova_compute[230216]: + [[ ! -n '' ]]
Dec 01 10:09:41 compute-2 nova_compute[230216]: + . kolla_extend_start
Dec 01 10:09:41 compute-2 nova_compute[230216]: + echo 'Running command: '\''nova-compute'\'''
Dec 01 10:09:41 compute-2 nova_compute[230216]: Running command: 'nova-compute'
Dec 01 10:09:41 compute-2 nova_compute[230216]: + umask 0022
Dec 01 10:09:41 compute-2 nova_compute[230216]: + exec nova-compute
Dec 01 10:09:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:42 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:42 compute-2 ceph-mon[76053]: pgmap v561: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:09:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:42 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:42 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:43 compute-2 sudo[230382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxwenapplhyzzkvygagdaynvelduagsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764583783.1835327-4387-51082553363669/AnsiballZ_podman_container.py'
Dec 01 10:09:43 compute-2 sudo[230382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:09:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:09:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:43.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:09:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:43.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:43 compute-2 python3.9[230384]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 01 10:09:43 compute-2 systemd[1]: Started libpod-conmon-4280beb294a27f3c0fe8dc7248224a87024c9f62bde2f199a1ad1a5e6305f7dc.scope.
Dec 01 10:09:43 compute-2 systemd[1]: Started libcrun container.
Dec 01 10:09:43 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eba934f97c7c871ba3b82417660bbf138187baa532d43a2e5c0c00e2f49a8aec/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Dec 01 10:09:43 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eba934f97c7c871ba3b82417660bbf138187baa532d43a2e5c0c00e2f49a8aec/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 01 10:09:43 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eba934f97c7c871ba3b82417660bbf138187baa532d43a2e5c0c00e2f49a8aec/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Dec 01 10:09:43 compute-2 podman[230409]: 2025-12-01 10:09:43.972570468 +0000 UTC m=+0.125570983 container init 4280beb294a27f3c0fe8dc7248224a87024c9f62bde2f199a1ad1a5e6305f7dc (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=nova_compute_init, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 01 10:09:43 compute-2 podman[230409]: 2025-12-01 10:09:43.982240227 +0000 UTC m=+0.135240712 container start 4280beb294a27f3c0fe8dc7248224a87024c9f62bde2f199a1ad1a5e6305f7dc (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 01 10:09:43 compute-2 python3.9[230384]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Dec 01 10:09:44 compute-2 nova_compute[230216]: 2025-12-01 10:09:44.016 230220 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 01 10:09:44 compute-2 nova_compute[230216]: 2025-12-01 10:09:44.016 230220 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 01 10:09:44 compute-2 nova_compute[230216]: 2025-12-01 10:09:44.017 230220 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 01 10:09:44 compute-2 nova_compute[230216]: 2025-12-01 10:09:44.017 230220 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Dec 01 10:09:44 compute-2 nova_compute_init[230432]: INFO:nova_statedir:Applying nova statedir ownership
Dec 01 10:09:44 compute-2 nova_compute_init[230432]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Dec 01 10:09:44 compute-2 nova_compute_init[230432]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Dec 01 10:09:44 compute-2 nova_compute_init[230432]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Dec 01 10:09:44 compute-2 nova_compute_init[230432]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Dec 01 10:09:44 compute-2 nova_compute_init[230432]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Dec 01 10:09:44 compute-2 nova_compute_init[230432]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Dec 01 10:09:44 compute-2 nova_compute_init[230432]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Dec 01 10:09:44 compute-2 nova_compute_init[230432]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Dec 01 10:09:44 compute-2 nova_compute_init[230432]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Dec 01 10:09:44 compute-2 nova_compute_init[230432]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Dec 01 10:09:44 compute-2 nova_compute_init[230432]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Dec 01 10:09:44 compute-2 nova_compute_init[230432]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Dec 01 10:09:44 compute-2 nova_compute_init[230432]: INFO:nova_statedir:Nova statedir ownership complete
Dec 01 10:09:44 compute-2 systemd[1]: libpod-4280beb294a27f3c0fe8dc7248224a87024c9f62bde2f199a1ad1a5e6305f7dc.scope: Deactivated successfully.
Dec 01 10:09:44 compute-2 podman[230446]: 2025-12-01 10:09:44.089787386 +0000 UTC m=+0.024709038 container died 4280beb294a27f3c0fe8dc7248224a87024c9f62bde2f199a1ad1a5e6305f7dc (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 01 10:09:44 compute-2 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4280beb294a27f3c0fe8dc7248224a87024c9f62bde2f199a1ad1a5e6305f7dc-userdata-shm.mount: Deactivated successfully.
Dec 01 10:09:44 compute-2 systemd[1]: var-lib-containers-storage-overlay-eba934f97c7c871ba3b82417660bbf138187baa532d43a2e5c0c00e2f49a8aec-merged.mount: Deactivated successfully.
Dec 01 10:09:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:44 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff710000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:44 compute-2 podman[230446]: 2025-12-01 10:09:44.123791191 +0000 UTC m=+0.058712823 container cleanup 4280beb294a27f3c0fe8dc7248224a87024c9f62bde2f199a1ad1a5e6305f7dc (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=nova_compute_init)
Dec 01 10:09:44 compute-2 systemd[1]: libpod-conmon-4280beb294a27f3c0fe8dc7248224a87024c9f62bde2f199a1ad1a5e6305f7dc.scope: Deactivated successfully.
Dec 01 10:09:44 compute-2 sudo[230382]: pam_unix(sudo:session): session closed for user root
Dec 01 10:09:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:44 compute-2 nova_compute[230216]: 2025-12-01 10:09:44.169 230220 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:09:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:44 compute-2 nova_compute[230216]: 2025-12-01 10:09:44.195 230220 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:09:44 compute-2 nova_compute[230216]: 2025-12-01 10:09:44.196 230220 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Dec 01 10:09:44 compute-2 ceph-mon[76053]: pgmap v562: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:09:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:44 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6ec000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:44 compute-2 nova_compute[230216]: 2025-12-01 10:09:44.904 230220 INFO nova.virt.driver [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 01 10:09:44 compute-2 sshd-session[200921]: Connection closed by 192.168.122.30 port 47414
Dec 01 10:09:44 compute-2 sshd-session[200918]: pam_unix(sshd:session): session closed for user zuul
Dec 01 10:09:44 compute-2 systemd[1]: session-53.scope: Deactivated successfully.
Dec 01 10:09:44 compute-2 systemd[1]: session-53.scope: Consumed 2min 23.692s CPU time.
Dec 01 10:09:44 compute-2 systemd-logind[795]: Session 53 logged out. Waiting for processes to exit.
Dec 01 10:09:44 compute-2 systemd-logind[795]: Removed session 53.
Dec 01 10:09:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:44 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.030 230220 INFO nova.compute.provider_config [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 01 10:09:45 compute-2 podman[230499]: 2025-12-01 10:09:45.050427432 +0000 UTC m=+0.071786989 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 01 10:09:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:45.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:09:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:45.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.727 230220 DEBUG oslo_concurrency.lockutils [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.728 230220 DEBUG oslo_concurrency.lockutils [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.728 230220 DEBUG oslo_concurrency.lockutils [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.729 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.729 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.729 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.729 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.730 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.730 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.730 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.730 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.730 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.731 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.731 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.731 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.731 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.732 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.732 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.732 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.732 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.732 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.733 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.733 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.733 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.733 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.734 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.734 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.734 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.734 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.734 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.735 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.735 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.735 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.735 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.735 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.736 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.736 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.736 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.736 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.737 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.737 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.737 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.737 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.737 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.738 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.738 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.738 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.739 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.739 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.739 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.739 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.740 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.740 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.740 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.741 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.741 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.741 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.741 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.742 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.742 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.742 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.742 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.742 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.743 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.743 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.743 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.743 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.743 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.744 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.744 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.744 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.744 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.745 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.745 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.745 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.745 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.745 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.746 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.746 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.746 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.746 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.747 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.747 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.747 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.747 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.747 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.748 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.748 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.748 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.748 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.748 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.749 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.749 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.749 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.749 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.749 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.750 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.750 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.750 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.750 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.750 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.751 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.751 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.751 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.751 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.751 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.752 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.752 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.752 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.752 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.753 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.753 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.753 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.753 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.753 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.754 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.754 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.754 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.754 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.754 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.755 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.755 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.755 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.755 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.755 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.756 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.756 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.756 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.756 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.756 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.757 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.757 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.757 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.757 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.758 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.758 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.758 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.758 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.758 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.759 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.759 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.759 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.759 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.760 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.760 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.760 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.760 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.760 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.761 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.761 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.761 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.761 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.762 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.762 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.762 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.762 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.763 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.763 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.763 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.763 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.763 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.764 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.764 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.764 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.764 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.765 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.765 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.765 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.765 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.765 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.766 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.766 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.766 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.766 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.767 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.767 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.767 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.767 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.767 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.768 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.768 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.768 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.768 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.769 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.769 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.769 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.769 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.769 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.770 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.770 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.770 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.770 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.770 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.771 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.771 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.771 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.771 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.771 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.772 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.772 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.772 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.772 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.772 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.773 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.773 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.773 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.773 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.774 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.774 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.774 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.774 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.774 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.775 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.775 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.775 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.775 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.776 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.776 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.776 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.776 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.776 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.777 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.777 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.777 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.777 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.778 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.778 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.778 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.778 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.778 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.779 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.779 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.779 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.779 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.779 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.780 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.780 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.780 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.780 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.780 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.781 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.781 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.781 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.781 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.782 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.782 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.782 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.782 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.782 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.783 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.783 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.783 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.783 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.783 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.784 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.784 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.784 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.784 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.784 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.785 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.785 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.785 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.785 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.786 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.786 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.786 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.786 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.787 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.787 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.787 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.787 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.787 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.788 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.788 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.788 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.788 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.788 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.789 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.789 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.789 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.789 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.790 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.790 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.790 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.790 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.791 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.791 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.791 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.791 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.791 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.792 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.792 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.792 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.792 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.793 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.793 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.793 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.793 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.793 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.794 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.794 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.794 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.794 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.795 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.795 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.795 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.795 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.795 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.796 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.796 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.796 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.796 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.796 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.797 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.797 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.797 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.797 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.797 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.798 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.798 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.798 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.799 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.799 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.799 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.799 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.799 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.800 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.800 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.800 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.800 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.800 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.801 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.801 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.801 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.801 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.801 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.802 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.802 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.802 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.802 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.803 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.803 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.803 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.803 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.803 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.804 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.804 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.804 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.804 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.804 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.805 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.805 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.805 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.805 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.806 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.806 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.806 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.806 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.807 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.807 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.807 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.807 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.807 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.808 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.808 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.808 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.808 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.809 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.809 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.809 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.809 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.810 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.810 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.810 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.810 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.810 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.810 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.811 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.811 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.811 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.811 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.811 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.812 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.812 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.812 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.812 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.812 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.813 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.813 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.813 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.813 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.813 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.814 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.814 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.814 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.814 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.815 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.815 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.815 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.815 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.815 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.816 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.816 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.816 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.816 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.816 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.817 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.817 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.817 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.817 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.817 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.818 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.818 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.818 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.818 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.818 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.819 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.819 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.819 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.819 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.820 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.820 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.820 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.820 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.820 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.821 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.821 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.821 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.821 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.821 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.822 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.822 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.822 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.822 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.822 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.823 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.823 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.823 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.823 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.823 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.824 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.824 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.824 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.824 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.824 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.825 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.825 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.825 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.825 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.825 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.826 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.826 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.826 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.826 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.827 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.827 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.827 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.827 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.827 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.828 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.828 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.828 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.828 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.829 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.829 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.829 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.829 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.829 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.830 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.830 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.830 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.830 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.831 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.831 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.831 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.831 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.831 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.832 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.832 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.832 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.832 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.833 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.833 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.833 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.833 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.834 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.834 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.834 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.834 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.834 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.834 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.835 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.835 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.835 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.835 230220 WARNING oslo_config.cfg [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 01 10:09:45 compute-2 nova_compute[230216]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 01 10:09:45 compute-2 nova_compute[230216]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 01 10:09:45 compute-2 nova_compute[230216]: and ``live_migration_inbound_addr`` respectively.
Dec 01 10:09:45 compute-2 nova_compute[230216]: ).  Its value may be silently ignored in the future.
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.836 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.836 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.836 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.836 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.837 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.837 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.837 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.837 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.838 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.838 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.838 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.838 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.838 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.839 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.839 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.839 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.840 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.840 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.840 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.rbd_secret_uuid        = 365f19c2-81e5-5edd-b6b4-280555214d3a log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.840 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.840 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.841 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.841 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.841 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.841 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.841 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.842 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.842 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.842 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.842 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.843 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.843 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.843 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.843 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.844 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.844 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.844 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.844 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.844 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.845 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.845 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.845 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.845 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.846 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.846 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.846 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.846 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.846 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.847 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.847 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.847 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.847 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.847 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.848 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.848 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.848 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.848 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.849 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.849 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.849 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.849 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.849 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.850 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.850 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.850 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.850 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.850 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.851 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.851 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.851 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.851 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.851 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.852 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.852 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.852 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.852 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.853 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.853 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.853 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.853 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.854 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.854 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.854 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.854 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.854 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.855 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.855 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.855 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.855 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.855 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.856 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.856 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.856 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.856 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.856 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.857 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.857 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.857 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.857 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.857 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.858 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.858 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.858 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.858 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.859 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.859 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.859 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.859 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.859 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.860 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.860 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.860 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.860 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.860 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.861 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.861 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.861 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.861 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.862 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.862 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.862 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.862 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.862 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.863 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.863 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.863 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.863 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.863 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.864 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.864 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.864 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.864 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.865 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.865 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.865 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.865 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.865 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.866 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.866 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.866 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.867 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.867 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.867 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.867 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.867 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.868 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.868 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.868 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.868 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.868 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.869 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.869 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.869 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.870 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.870 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.870 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.870 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.871 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.871 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.871 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.871 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.871 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.872 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.872 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.872 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.872 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.873 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.873 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.873 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.873 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.874 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.874 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.874 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.874 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.874 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.875 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.875 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.875 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.875 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.876 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.876 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.876 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.876 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.877 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.877 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.877 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.877 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.877 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.878 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.878 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.878 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.878 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.878 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.879 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.879 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.879 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.879 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.880 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.880 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.880 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.880 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.881 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.881 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.881 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.881 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.882 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.882 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.882 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.882 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.882 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.883 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.883 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.883 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.883 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.883 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.884 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.884 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.884 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.884 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.884 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.885 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.885 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.885 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.885 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.886 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.886 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.886 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.886 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.886 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.887 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.887 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.887 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.887 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.888 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.888 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.888 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.888 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.888 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.889 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.889 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.889 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.889 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.890 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.890 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.890 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.890 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.891 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.891 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.891 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.892 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.892 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.893 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.893 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.893 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.893 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.894 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.894 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.894 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.894 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.895 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.895 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.895 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.895 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.895 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.896 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.896 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.896 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.896 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.896 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.897 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.897 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.897 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.897 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.898 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.898 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.898 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.898 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.898 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.899 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.899 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.899 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.899 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.899 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.900 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.900 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.900 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.900 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.901 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.901 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.901 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.901 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.902 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.902 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.902 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.902 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.903 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.903 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.903 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.903 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.903 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.904 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.904 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.904 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.904 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.904 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.905 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.905 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.905 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.905 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.906 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.906 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.906 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.906 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.906 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.907 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.907 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.907 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.907 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.907 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.908 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.908 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.908 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.908 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.909 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.909 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.909 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.909 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.910 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.910 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.910 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.910 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.910 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.911 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.911 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.911 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.911 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.912 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.912 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.912 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.912 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.912 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.913 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.913 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.913 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.913 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.914 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.914 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.914 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.914 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.914 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.915 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.915 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.915 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.915 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.915 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.916 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.916 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.916 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.916 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.917 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.917 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.917 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.917 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.917 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.918 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.918 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.918 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.918 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.918 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.919 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.919 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.919 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.919 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.920 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.920 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.920 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.920 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.920 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.921 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.921 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.921 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.921 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.922 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.922 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.922 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.922 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.922 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.923 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.923 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.923 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.923 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.924 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.924 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.924 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.924 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.924 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.925 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.925 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.925 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.925 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.925 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.926 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.926 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.926 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.926 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.927 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.927 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.927 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.927 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.927 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.928 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.928 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.928 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.928 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.929 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.929 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.929 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.929 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.929 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.930 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.930 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.930 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.930 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.930 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.932 230220 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.965 230220 INFO nova.virt.node [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Determined node identity 801130cb-2e08-4a6f-b53c-1300fad37b0c from /var/lib/nova/compute_id
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.966 230220 DEBUG nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.966 230220 DEBUG nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.967 230220 DEBUG nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.967 230220 DEBUG nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.986 230220 DEBUG nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f24b48e0eb0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.988 230220 DEBUG nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f24b48e0eb0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.989 230220 INFO nova.virt.libvirt.driver [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Connection event '1' reason 'None'
Dec 01 10:09:45 compute-2 nova_compute[230216]: 2025-12-01 10:09:45.997 230220 INFO nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Libvirt host capabilities <capabilities>
Dec 01 10:09:45 compute-2 nova_compute[230216]: 
Dec 01 10:09:45 compute-2 nova_compute[230216]:   <host>
Dec 01 10:09:45 compute-2 nova_compute[230216]:     <uuid>c016036b-c202-4470-908b-16395dc3b958</uuid>
Dec 01 10:09:45 compute-2 nova_compute[230216]:     <cpu>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <arch>x86_64</arch>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <model>EPYC-Rome-v4</model>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <vendor>AMD</vendor>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <microcode version='16777317'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <signature family='23' model='49' stepping='0'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <maxphysaddr mode='emulate' bits='40'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <feature name='x2apic'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <feature name='tsc-deadline'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <feature name='osxsave'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <feature name='hypervisor'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <feature name='tsc_adjust'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <feature name='spec-ctrl'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <feature name='stibp'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <feature name='arch-capabilities'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <feature name='ssbd'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <feature name='cmp_legacy'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <feature name='topoext'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <feature name='virt-ssbd'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <feature name='lbrv'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <feature name='tsc-scale'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <feature name='vmcb-clean'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <feature name='pause-filter'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <feature name='pfthreshold'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <feature name='svme-addr-chk'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <feature name='rdctl-no'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <feature name='skip-l1dfl-vmentry'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <feature name='mds-no'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <feature name='pschange-mc-no'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <pages unit='KiB' size='4'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <pages unit='KiB' size='2048'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <pages unit='KiB' size='1048576'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:     </cpu>
Dec 01 10:09:45 compute-2 nova_compute[230216]:     <power_management>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <suspend_mem/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:     </power_management>
Dec 01 10:09:45 compute-2 nova_compute[230216]:     <iommu support='no'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:     <migration_features>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <live/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <uri_transports>
Dec 01 10:09:45 compute-2 nova_compute[230216]:         <uri_transport>tcp</uri_transport>
Dec 01 10:09:45 compute-2 nova_compute[230216]:         <uri_transport>rdma</uri_transport>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       </uri_transports>
Dec 01 10:09:45 compute-2 nova_compute[230216]:     </migration_features>
Dec 01 10:09:45 compute-2 nova_compute[230216]:     <topology>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <cells num='1'>
Dec 01 10:09:45 compute-2 nova_compute[230216]:         <cell id='0'>
Dec 01 10:09:45 compute-2 nova_compute[230216]:           <memory unit='KiB'>7864316</memory>
Dec 01 10:09:45 compute-2 nova_compute[230216]:           <pages unit='KiB' size='4'>1966079</pages>
Dec 01 10:09:45 compute-2 nova_compute[230216]:           <pages unit='KiB' size='2048'>0</pages>
Dec 01 10:09:45 compute-2 nova_compute[230216]:           <pages unit='KiB' size='1048576'>0</pages>
Dec 01 10:09:45 compute-2 nova_compute[230216]:           <distances>
Dec 01 10:09:45 compute-2 nova_compute[230216]:             <sibling id='0' value='10'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:           </distances>
Dec 01 10:09:45 compute-2 nova_compute[230216]:           <cpus num='8'>
Dec 01 10:09:45 compute-2 nova_compute[230216]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:           </cpus>
Dec 01 10:09:45 compute-2 nova_compute[230216]:         </cell>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       </cells>
Dec 01 10:09:45 compute-2 nova_compute[230216]:     </topology>
Dec 01 10:09:45 compute-2 nova_compute[230216]:     <cache>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:     </cache>
Dec 01 10:09:45 compute-2 nova_compute[230216]:     <secmodel>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <model>selinux</model>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <doi>0</doi>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 01 10:09:45 compute-2 nova_compute[230216]:     </secmodel>
Dec 01 10:09:45 compute-2 nova_compute[230216]:     <secmodel>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <model>dac</model>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <doi>0</doi>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <baselabel type='kvm'>+107:+107</baselabel>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <baselabel type='qemu'>+107:+107</baselabel>
Dec 01 10:09:45 compute-2 nova_compute[230216]:     </secmodel>
Dec 01 10:09:45 compute-2 nova_compute[230216]:   </host>
Dec 01 10:09:45 compute-2 nova_compute[230216]: 
Dec 01 10:09:45 compute-2 nova_compute[230216]:   <guest>
Dec 01 10:09:45 compute-2 nova_compute[230216]:     <os_type>hvm</os_type>
Dec 01 10:09:45 compute-2 nova_compute[230216]:     <arch name='i686'>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <wordsize>32</wordsize>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <domain type='qemu'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <domain type='kvm'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:     </arch>
Dec 01 10:09:45 compute-2 nova_compute[230216]:     <features>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <pae/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <nonpae/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <acpi default='on' toggle='yes'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <apic default='on' toggle='no'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <cpuselection/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <deviceboot/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <disksnapshot default='on' toggle='no'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <externalSnapshot/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:     </features>
Dec 01 10:09:45 compute-2 nova_compute[230216]:   </guest>
Dec 01 10:09:45 compute-2 nova_compute[230216]: 
Dec 01 10:09:45 compute-2 nova_compute[230216]:   <guest>
Dec 01 10:09:45 compute-2 nova_compute[230216]:     <os_type>hvm</os_type>
Dec 01 10:09:45 compute-2 nova_compute[230216]:     <arch name='x86_64'>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <wordsize>64</wordsize>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <domain type='qemu'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <domain type='kvm'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:     </arch>
Dec 01 10:09:45 compute-2 nova_compute[230216]:     <features>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <acpi default='on' toggle='yes'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <apic default='on' toggle='no'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <cpuselection/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <deviceboot/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <disksnapshot default='on' toggle='no'/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:       <externalSnapshot/>
Dec 01 10:09:45 compute-2 nova_compute[230216]:     </features>
Dec 01 10:09:45 compute-2 nova_compute[230216]:   </guest>
Dec 01 10:09:45 compute-2 nova_compute[230216]: 
Dec 01 10:09:45 compute-2 nova_compute[230216]: </capabilities>
Dec 01 10:09:45 compute-2 nova_compute[230216]: 
Dec 01 10:09:46 compute-2 nova_compute[230216]: 2025-12-01 10:09:46.005 230220 DEBUG nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 01 10:09:46 compute-2 nova_compute[230216]: 2025-12-01 10:09:46.010 230220 DEBUG nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 01 10:09:46 compute-2 nova_compute[230216]: <domainCapabilities>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <path>/usr/libexec/qemu-kvm</path>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <domain>kvm</domain>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <arch>i686</arch>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <vcpu max='240'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <iothreads supported='yes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <os supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <enum name='firmware'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <loader supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='type'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>rom</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>pflash</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='readonly'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>yes</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>no</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='secure'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>no</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </loader>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   </os>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <cpu>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <mode name='host-passthrough' supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='hostPassthroughMigratable'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>on</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>off</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </mode>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <mode name='maximum' supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='maximumMigratable'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>on</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>off</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </mode>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <mode name='host-model' supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <vendor>AMD</vendor>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='x2apic'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='tsc-deadline'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='hypervisor'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='tsc_adjust'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='spec-ctrl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='stibp'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='ssbd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='cmp_legacy'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='overflow-recov'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='succor'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='ibrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='amd-ssbd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='virt-ssbd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='lbrv'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='tsc-scale'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='vmcb-clean'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='flushbyasid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='pause-filter'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='pfthreshold'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='svme-addr-chk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='disable' name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </mode>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <mode name='custom' supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell-noTSX'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cascadelake-Server'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cascadelake-Server-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cascadelake-Server-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cascadelake-Server-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cascadelake-Server-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cascadelake-Server-v5'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cooperlake'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cooperlake-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cooperlake-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Denverton'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mpx'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Denverton-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mpx'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Denverton-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Denverton-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Dhyana-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Genoa'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amd-psfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='auto-ibrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='no-nested-data-bp'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='null-sel-clr-base'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='stibp-always-on'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Genoa-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amd-psfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='auto-ibrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='no-nested-data-bp'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='null-sel-clr-base'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='stibp-always-on'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Milan'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Milan-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Milan-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amd-psfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='no-nested-data-bp'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='null-sel-clr-base'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='stibp-always-on'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Rome'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Rome-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Rome-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Rome-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='GraniteRapids'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-tile'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fbsdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrc'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fzrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mcdt-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pbrsb-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='prefetchiti'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='psdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='GraniteRapids-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-tile'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fbsdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrc'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fzrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mcdt-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pbrsb-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='prefetchiti'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='psdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='GraniteRapids-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-tile'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx10'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx10-128'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx10-256'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx10-512'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cldemote'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fbsdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrc'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fzrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mcdt-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdir64b'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdiri'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pbrsb-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='prefetchiti'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='psdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell-noTSX'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-noTSX'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-v5'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-v6'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-v7'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='IvyBridge'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='IvyBridge-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='IvyBridge-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='IvyBridge-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='KnightsMill'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-4fmaps'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-4vnniw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512er'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512pf'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='KnightsMill-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-4fmaps'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-4vnniw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512er'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512pf'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Opteron_G4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fma4'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xop'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Opteron_G4-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fma4'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xop'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Opteron_G5'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fma4'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tbm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xop'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Opteron_G5-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fma4'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tbm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xop'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='SapphireRapids'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-tile'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrc'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fzrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='SapphireRapids-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-tile'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrc'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fzrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='SapphireRapids-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-tile'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fbsdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrc'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fzrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='psdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='SapphireRapids-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-tile'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cldemote'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fbsdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrc'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fzrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdir64b'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdiri'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='psdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='SierraForest'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-ne-convert'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cmpccxadd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fbsdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mcdt-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pbrsb-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='psdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='SierraForest-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-ne-convert'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cmpccxadd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fbsdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mcdt-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pbrsb-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='psdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Client'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Client-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Client-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Client-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Client-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Client-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server-v5'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Snowridge'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cldemote'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='core-capability'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdir64b'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdiri'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mpx'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='split-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Snowridge-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cldemote'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='core-capability'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdir64b'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdiri'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mpx'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='split-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Snowridge-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cldemote'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='core-capability'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdir64b'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdiri'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='split-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Snowridge-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cldemote'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='core-capability'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdir64b'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdiri'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='split-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Snowridge-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cldemote'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdir64b'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdiri'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='athlon'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnow'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnowext'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='athlon-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnow'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnowext'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='core2duo'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='core2duo-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='coreduo'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='coreduo-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='n270'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='n270-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='phenom'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnow'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnowext'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='phenom-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnow'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnowext'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </mode>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   </cpu>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <memoryBacking supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <enum name='sourceType'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <value>file</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <value>anonymous</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <value>memfd</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   </memoryBacking>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <devices>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <disk supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='diskDevice'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>disk</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>cdrom</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>floppy</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>lun</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='bus'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>ide</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>fdc</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>scsi</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>usb</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>sata</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='model'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio-transitional</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio-non-transitional</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </disk>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <graphics supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='type'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>vnc</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>egl-headless</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>dbus</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </graphics>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <video supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='modelType'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>vga</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>cirrus</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>none</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>bochs</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>ramfb</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </video>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <hostdev supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='mode'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>subsystem</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='startupPolicy'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>default</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>mandatory</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>requisite</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>optional</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='subsysType'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>usb</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>pci</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>scsi</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='capsType'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='pciBackend'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </hostdev>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <rng supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='model'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio-transitional</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio-non-transitional</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='backendModel'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>random</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>egd</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>builtin</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </rng>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <filesystem supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='driverType'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>path</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>handle</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtiofs</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </filesystem>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <tpm supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='model'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>tpm-tis</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>tpm-crb</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='backendModel'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>emulator</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>external</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='backendVersion'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>2.0</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </tpm>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <redirdev supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='bus'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>usb</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </redirdev>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <channel supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='type'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>pty</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>unix</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </channel>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <crypto supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='model'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='type'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>qemu</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='backendModel'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>builtin</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </crypto>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <interface supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='backendType'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>default</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>passt</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </interface>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <panic supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='model'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>isa</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>hyperv</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </panic>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <console supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='type'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>null</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>vc</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>pty</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>dev</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>file</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>pipe</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>stdio</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>udp</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>tcp</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>unix</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>qemu-vdagent</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>dbus</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </console>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   </devices>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <features>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <gic supported='no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <vmcoreinfo supported='yes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <genid supported='yes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <backingStoreInput supported='yes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <backup supported='yes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <async-teardown supported='yes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <ps2 supported='yes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <sev supported='no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <sgx supported='no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <hyperv supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='features'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>relaxed</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>vapic</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>spinlocks</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>vpindex</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>runtime</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>synic</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>stimer</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>reset</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>vendor_id</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>frequencies</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>reenlightenment</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>tlbflush</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>ipi</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>avic</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>emsr_bitmap</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>xmm_input</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <defaults>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <spinlocks>4095</spinlocks>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <stimer_direct>on</stimer_direct>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <tlbflush_direct>on</tlbflush_direct>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <tlbflush_extended>on</tlbflush_extended>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </defaults>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </hyperv>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <launchSecurity supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='sectype'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>tdx</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </launchSecurity>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   </features>
Dec 01 10:09:46 compute-2 nova_compute[230216]: </domainCapabilities>
Dec 01 10:09:46 compute-2 nova_compute[230216]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 01 10:09:46 compute-2 nova_compute[230216]: 2025-12-01 10:09:46.016 230220 DEBUG nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 01 10:09:46 compute-2 nova_compute[230216]: <domainCapabilities>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <path>/usr/libexec/qemu-kvm</path>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <domain>kvm</domain>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <arch>i686</arch>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <vcpu max='4096'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <iothreads supported='yes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <os supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <enum name='firmware'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <loader supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='type'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>rom</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>pflash</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='readonly'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>yes</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>no</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='secure'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>no</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </loader>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   </os>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <cpu>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <mode name='host-passthrough' supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='hostPassthroughMigratable'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>on</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>off</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </mode>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <mode name='maximum' supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='maximumMigratable'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>on</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>off</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </mode>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <mode name='host-model' supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <vendor>AMD</vendor>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='x2apic'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='tsc-deadline'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='hypervisor'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='tsc_adjust'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='spec-ctrl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='stibp'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='ssbd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='cmp_legacy'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='overflow-recov'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='succor'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='ibrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='amd-ssbd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='virt-ssbd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='lbrv'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='tsc-scale'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='vmcb-clean'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='flushbyasid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='pause-filter'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='pfthreshold'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='svme-addr-chk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='disable' name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </mode>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <mode name='custom' supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell-noTSX'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cascadelake-Server'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cascadelake-Server-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cascadelake-Server-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cascadelake-Server-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cascadelake-Server-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cascadelake-Server-v5'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cooperlake'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cooperlake-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cooperlake-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Denverton'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mpx'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Denverton-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mpx'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Denverton-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Denverton-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Dhyana-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Genoa'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amd-psfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='auto-ibrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='no-nested-data-bp'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='null-sel-clr-base'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='stibp-always-on'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Genoa-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amd-psfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='auto-ibrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='no-nested-data-bp'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='null-sel-clr-base'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='stibp-always-on'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Milan'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Milan-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Milan-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amd-psfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='no-nested-data-bp'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='null-sel-clr-base'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='stibp-always-on'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Rome'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Rome-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Rome-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Rome-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='GraniteRapids'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-tile'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fbsdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrc'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fzrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mcdt-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pbrsb-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='prefetchiti'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='psdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='GraniteRapids-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-tile'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fbsdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrc'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fzrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mcdt-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pbrsb-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='prefetchiti'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='psdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='GraniteRapids-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-tile'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx10'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx10-128'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx10-256'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx10-512'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cldemote'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fbsdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrc'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fzrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mcdt-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdir64b'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdiri'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pbrsb-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='prefetchiti'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='psdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell-noTSX'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-noTSX'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100946 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-v5'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-v6'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-v7'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='IvyBridge'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='IvyBridge-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='IvyBridge-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='IvyBridge-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='KnightsMill'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-4fmaps'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-4vnniw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512er'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512pf'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='KnightsMill-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-4fmaps'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-4vnniw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512er'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512pf'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Opteron_G4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fma4'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xop'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Opteron_G4-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fma4'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xop'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Opteron_G5'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fma4'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tbm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xop'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Opteron_G5-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fma4'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tbm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xop'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='SapphireRapids'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-tile'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:46 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrc'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fzrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='SapphireRapids-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-tile'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrc'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fzrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='SapphireRapids-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-tile'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fbsdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrc'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fzrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='psdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='SapphireRapids-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-tile'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cldemote'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fbsdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrc'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fzrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdir64b'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdiri'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='psdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='SierraForest'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-ne-convert'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cmpccxadd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fbsdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mcdt-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pbrsb-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='psdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='SierraForest-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-ne-convert'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cmpccxadd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fbsdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mcdt-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pbrsb-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='psdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Client'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Client-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Client-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Client-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Client-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Client-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server-v5'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Snowridge'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cldemote'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='core-capability'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdir64b'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdiri'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mpx'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='split-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Snowridge-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cldemote'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='core-capability'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdir64b'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdiri'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mpx'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='split-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Snowridge-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cldemote'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='core-capability'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdir64b'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdiri'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='split-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Snowridge-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cldemote'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='core-capability'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdir64b'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdiri'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='split-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Snowridge-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cldemote'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdir64b'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdiri'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='athlon'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnow'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnowext'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='athlon-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnow'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnowext'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='core2duo'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='core2duo-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='coreduo'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='coreduo-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='n270'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='n270-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='phenom'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnow'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnowext'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='phenom-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnow'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnowext'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </mode>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   </cpu>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <memoryBacking supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <enum name='sourceType'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <value>file</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <value>anonymous</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <value>memfd</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   </memoryBacking>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <devices>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <disk supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='diskDevice'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>disk</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>cdrom</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>floppy</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>lun</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='bus'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>fdc</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>scsi</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>usb</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>sata</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='model'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio-transitional</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio-non-transitional</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </disk>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <graphics supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='type'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>vnc</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>egl-headless</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>dbus</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </graphics>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <video supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='modelType'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>vga</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>cirrus</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>none</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>bochs</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>ramfb</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </video>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <hostdev supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='mode'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>subsystem</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='startupPolicy'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>default</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>mandatory</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>requisite</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>optional</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='subsysType'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>usb</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>pci</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>scsi</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='capsType'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='pciBackend'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </hostdev>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <rng supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='model'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio-transitional</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio-non-transitional</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='backendModel'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>random</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>egd</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>builtin</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </rng>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <filesystem supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='driverType'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>path</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>handle</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtiofs</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </filesystem>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <tpm supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='model'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>tpm-tis</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>tpm-crb</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='backendModel'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>emulator</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>external</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='backendVersion'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>2.0</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </tpm>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <redirdev supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='bus'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>usb</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </redirdev>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <channel supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='type'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>pty</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>unix</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </channel>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <crypto supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='model'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='type'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>qemu</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='backendModel'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>builtin</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </crypto>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <interface supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='backendType'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>default</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>passt</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </interface>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <panic supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='model'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>isa</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>hyperv</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </panic>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <console supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='type'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>null</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>vc</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>pty</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>dev</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>file</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>pipe</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>stdio</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>udp</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>tcp</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>unix</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>qemu-vdagent</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>dbus</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </console>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   </devices>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <features>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <gic supported='no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <vmcoreinfo supported='yes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <genid supported='yes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <backingStoreInput supported='yes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <backup supported='yes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <async-teardown supported='yes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <ps2 supported='yes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <sev supported='no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <sgx supported='no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <hyperv supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='features'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>relaxed</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>vapic</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>spinlocks</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>vpindex</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>runtime</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>synic</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>stimer</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>reset</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>vendor_id</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>frequencies</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>reenlightenment</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>tlbflush</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>ipi</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>avic</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>emsr_bitmap</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>xmm_input</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <defaults>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <spinlocks>4095</spinlocks>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <stimer_direct>on</stimer_direct>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <tlbflush_direct>on</tlbflush_direct>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <tlbflush_extended>on</tlbflush_extended>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </defaults>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </hyperv>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <launchSecurity supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='sectype'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>tdx</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </launchSecurity>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   </features>
Dec 01 10:09:46 compute-2 nova_compute[230216]: </domainCapabilities>
Dec 01 10:09:46 compute-2 nova_compute[230216]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 01 10:09:46 compute-2 nova_compute[230216]: 2025-12-01 10:09:46.055 230220 DEBUG nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 01 10:09:46 compute-2 nova_compute[230216]: 2025-12-01 10:09:46.060 230220 DEBUG nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 01 10:09:46 compute-2 nova_compute[230216]: <domainCapabilities>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <path>/usr/libexec/qemu-kvm</path>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <domain>kvm</domain>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <arch>x86_64</arch>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <vcpu max='240'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <iothreads supported='yes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <os supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <enum name='firmware'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <loader supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='type'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>rom</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>pflash</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='readonly'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>yes</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>no</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='secure'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>no</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </loader>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   </os>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <cpu>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <mode name='host-passthrough' supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='hostPassthroughMigratable'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>on</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>off</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </mode>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <mode name='maximum' supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='maximumMigratable'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>on</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>off</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </mode>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <mode name='host-model' supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <vendor>AMD</vendor>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='x2apic'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='tsc-deadline'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='hypervisor'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='tsc_adjust'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='spec-ctrl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='stibp'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='ssbd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='cmp_legacy'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='overflow-recov'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='succor'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='ibrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='amd-ssbd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='virt-ssbd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='lbrv'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='tsc-scale'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='vmcb-clean'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='flushbyasid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='pause-filter'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='pfthreshold'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='svme-addr-chk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='disable' name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </mode>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <mode name='custom' supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell-noTSX'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cascadelake-Server'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cascadelake-Server-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cascadelake-Server-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cascadelake-Server-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cascadelake-Server-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cascadelake-Server-v5'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cooperlake'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cooperlake-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cooperlake-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Denverton'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mpx'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Denverton-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mpx'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Denverton-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Denverton-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Dhyana-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Genoa'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amd-psfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='auto-ibrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='no-nested-data-bp'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='null-sel-clr-base'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='stibp-always-on'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Genoa-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amd-psfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='auto-ibrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='no-nested-data-bp'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='null-sel-clr-base'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='stibp-always-on'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Milan'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Milan-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Milan-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amd-psfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='no-nested-data-bp'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='null-sel-clr-base'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='stibp-always-on'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Rome'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Rome-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Rome-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Rome-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='GraniteRapids'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-tile'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fbsdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrc'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fzrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mcdt-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pbrsb-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='prefetchiti'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='psdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='GraniteRapids-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-tile'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fbsdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrc'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fzrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mcdt-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pbrsb-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='prefetchiti'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='psdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='GraniteRapids-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-tile'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx10'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx10-128'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx10-256'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx10-512'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cldemote'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fbsdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrc'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fzrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mcdt-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdir64b'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdiri'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pbrsb-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='prefetchiti'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='psdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell-noTSX'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-noTSX'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-v5'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-v6'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-v7'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='IvyBridge'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='IvyBridge-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='IvyBridge-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='IvyBridge-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='KnightsMill'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-4fmaps'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-4vnniw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512er'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512pf'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='KnightsMill-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-4fmaps'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-4vnniw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512er'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512pf'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Opteron_G4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fma4'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xop'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Opteron_G4-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fma4'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xop'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Opteron_G5'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fma4'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tbm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xop'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Opteron_G5-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fma4'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tbm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xop'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='SapphireRapids'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-tile'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrc'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fzrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='SapphireRapids-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-tile'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrc'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fzrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='SapphireRapids-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-tile'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fbsdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrc'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fzrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='psdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='SapphireRapids-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-tile'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cldemote'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fbsdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrc'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fzrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdir64b'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdiri'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='psdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='SierraForest'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-ne-convert'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cmpccxadd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fbsdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mcdt-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pbrsb-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='psdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='SierraForest-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-ne-convert'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cmpccxadd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fbsdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mcdt-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pbrsb-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='psdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Client'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Client-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Client-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Client-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Client-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Client-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server-v5'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Snowridge'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cldemote'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='core-capability'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdir64b'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdiri'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mpx'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='split-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Snowridge-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cldemote'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='core-capability'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdir64b'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdiri'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mpx'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='split-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Snowridge-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cldemote'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='core-capability'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdir64b'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdiri'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='split-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Snowridge-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cldemote'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='core-capability'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdir64b'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdiri'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='split-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Snowridge-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cldemote'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdir64b'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdiri'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='athlon'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnow'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnowext'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='athlon-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnow'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnowext'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='core2duo'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='core2duo-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='coreduo'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='coreduo-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='n270'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='n270-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='phenom'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnow'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnowext'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='phenom-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnow'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnowext'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </mode>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   </cpu>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <memoryBacking supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <enum name='sourceType'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <value>file</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <value>anonymous</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <value>memfd</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   </memoryBacking>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <devices>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <disk supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='diskDevice'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>disk</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>cdrom</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>floppy</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>lun</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='bus'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>ide</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>fdc</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>scsi</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>usb</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>sata</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='model'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio-transitional</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio-non-transitional</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </disk>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <graphics supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='type'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>vnc</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>egl-headless</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>dbus</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </graphics>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <video supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='modelType'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>vga</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>cirrus</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>none</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>bochs</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>ramfb</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </video>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <hostdev supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='mode'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>subsystem</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='startupPolicy'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>default</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>mandatory</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>requisite</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>optional</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='subsysType'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>usb</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>pci</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>scsi</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='capsType'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='pciBackend'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </hostdev>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <rng supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='model'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio-transitional</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio-non-transitional</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='backendModel'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>random</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>egd</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>builtin</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </rng>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <filesystem supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='driverType'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>path</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>handle</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtiofs</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </filesystem>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <tpm supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='model'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>tpm-tis</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>tpm-crb</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='backendModel'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>emulator</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>external</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='backendVersion'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>2.0</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </tpm>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <redirdev supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='bus'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>usb</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </redirdev>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <channel supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='type'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>pty</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>unix</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </channel>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <crypto supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='model'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='type'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>qemu</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='backendModel'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>builtin</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </crypto>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <interface supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='backendType'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>default</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>passt</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </interface>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <panic supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='model'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>isa</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>hyperv</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </panic>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <console supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='type'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>null</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>vc</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>pty</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>dev</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>file</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>pipe</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>stdio</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>udp</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>tcp</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>unix</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>qemu-vdagent</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>dbus</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </console>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   </devices>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <features>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <gic supported='no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <vmcoreinfo supported='yes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <genid supported='yes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <backingStoreInput supported='yes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <backup supported='yes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <async-teardown supported='yes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <ps2 supported='yes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <sev supported='no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <sgx supported='no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <hyperv supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='features'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>relaxed</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>vapic</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>spinlocks</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>vpindex</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>runtime</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>synic</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>stimer</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>reset</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>vendor_id</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>frequencies</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>reenlightenment</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>tlbflush</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>ipi</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>avic</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>emsr_bitmap</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>xmm_input</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <defaults>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <spinlocks>4095</spinlocks>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <stimer_direct>on</stimer_direct>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <tlbflush_direct>on</tlbflush_direct>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <tlbflush_extended>on</tlbflush_extended>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </defaults>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </hyperv>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <launchSecurity supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='sectype'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>tdx</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </launchSecurity>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   </features>
Dec 01 10:09:46 compute-2 nova_compute[230216]: </domainCapabilities>
Dec 01 10:09:46 compute-2 nova_compute[230216]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 01 10:09:46 compute-2 nova_compute[230216]: 2025-12-01 10:09:46.148 230220 DEBUG nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 01 10:09:46 compute-2 nova_compute[230216]: <domainCapabilities>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <path>/usr/libexec/qemu-kvm</path>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <domain>kvm</domain>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <arch>x86_64</arch>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <vcpu max='4096'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <iothreads supported='yes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <os supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <enum name='firmware'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <value>efi</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <loader supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='type'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>rom</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>pflash</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='readonly'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>yes</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>no</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='secure'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>yes</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>no</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </loader>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   </os>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <cpu>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <mode name='host-passthrough' supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='hostPassthroughMigratable'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>on</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>off</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </mode>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <mode name='maximum' supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='maximumMigratable'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>on</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>off</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </mode>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <mode name='host-model' supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <vendor>AMD</vendor>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='x2apic'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='tsc-deadline'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='hypervisor'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='tsc_adjust'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='spec-ctrl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='stibp'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='ssbd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='cmp_legacy'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='overflow-recov'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='succor'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='ibrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='amd-ssbd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='virt-ssbd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='lbrv'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='tsc-scale'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='vmcb-clean'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='flushbyasid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='pause-filter'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='pfthreshold'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='svme-addr-chk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <feature policy='disable' name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </mode>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <mode name='custom' supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell-noTSX'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Broadwell-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cascadelake-Server'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cascadelake-Server-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cascadelake-Server-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cascadelake-Server-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cascadelake-Server-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cascadelake-Server-v5'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cooperlake'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cooperlake-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Cooperlake-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Denverton'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mpx'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Denverton-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mpx'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Denverton-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Denverton-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Dhyana-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Genoa'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amd-psfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='auto-ibrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='no-nested-data-bp'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='null-sel-clr-base'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='stibp-always-on'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Genoa-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amd-psfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='auto-ibrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='no-nested-data-bp'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='null-sel-clr-base'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='stibp-always-on'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Milan'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Milan-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Milan-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amd-psfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='no-nested-data-bp'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='null-sel-clr-base'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='stibp-always-on'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Rome'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Rome-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Rome-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-Rome-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='EPYC-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='GraniteRapids'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-tile'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fbsdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrc'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fzrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mcdt-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pbrsb-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='prefetchiti'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='psdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='GraniteRapids-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-tile'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fbsdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrc'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fzrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mcdt-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pbrsb-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='prefetchiti'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='psdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='GraniteRapids-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-tile'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx10'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx10-128'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx10-256'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx10-512'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cldemote'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fbsdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrc'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fzrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mcdt-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdir64b'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdiri'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pbrsb-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='prefetchiti'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='psdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell-noTSX'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Haswell-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-noTSX'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-v5'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-v6'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Icelake-Server-v7'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='IvyBridge'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='IvyBridge-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='IvyBridge-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='IvyBridge-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='KnightsMill'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-4fmaps'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-4vnniw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512er'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512pf'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='KnightsMill-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-4fmaps'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-4vnniw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512er'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512pf'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Opteron_G4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fma4'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xop'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Opteron_G4-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fma4'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xop'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Opteron_G5'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fma4'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tbm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xop'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Opteron_G5-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fma4'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tbm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xop'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='SapphireRapids'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-tile'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrc'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fzrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='SapphireRapids-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-tile'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrc'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fzrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='SapphireRapids-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-tile'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fbsdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrc'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fzrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='psdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='SapphireRapids-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='amx-tile'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-bf16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-fp16'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512-vpopcntdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bitalg'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vbmi2'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cldemote'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fbsdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrc'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fzrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='la57'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdir64b'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdiri'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='psdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='taa-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='tsx-ldtrk'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xfd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='SierraForest'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-ne-convert'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cmpccxadd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fbsdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mcdt-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pbrsb-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='psdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='SierraForest-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-ifma'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-ne-convert'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx-vnni-int8'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='bus-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cmpccxadd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fbsdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='fsrs'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ibrs-all'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mcdt-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pbrsb-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='psdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='sbdr-ssdp-no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='serialize'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vaes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='vpclmulqdq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Client'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Client-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Client-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Client-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Client-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Client-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='hle'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='rtm'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Skylake-Server-v5'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512bw'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512cd'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512dq'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512f'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='avx512vl'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='invpcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pcid'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='pku'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Snowridge'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cldemote'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='core-capability'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdir64b'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdiri'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mpx'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='split-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Snowridge-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cldemote'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='core-capability'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdir64b'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdiri'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='mpx'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='split-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Snowridge-v2'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cldemote'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='core-capability'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdir64b'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdiri'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='split-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Snowridge-v3'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cldemote'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='core-capability'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdir64b'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdiri'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='split-lock-detect'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='Snowridge-v4'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='cldemote'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='erms'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='gfni'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdir64b'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='movdiri'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='xsaves'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='athlon'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnow'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnowext'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='athlon-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnow'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnowext'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='core2duo'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='core2duo-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='coreduo'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='coreduo-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='n270'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='n270-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='ss'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='phenom'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnow'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnowext'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <blockers model='phenom-v1'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnow'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <feature name='3dnowext'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </blockers>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </mode>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   </cpu>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <memoryBacking supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <enum name='sourceType'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <value>file</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <value>anonymous</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <value>memfd</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   </memoryBacking>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <devices>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <disk supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='diskDevice'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>disk</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>cdrom</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>floppy</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>lun</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='bus'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>fdc</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>scsi</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>usb</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>sata</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='model'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio-transitional</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio-non-transitional</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </disk>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <graphics supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='type'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>vnc</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>egl-headless</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>dbus</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </graphics>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <video supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='modelType'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>vga</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>cirrus</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>none</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>bochs</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>ramfb</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </video>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <hostdev supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='mode'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>subsystem</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='startupPolicy'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>default</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>mandatory</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>requisite</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>optional</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='subsysType'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>usb</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>pci</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>scsi</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='capsType'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='pciBackend'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </hostdev>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <rng supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='model'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio-transitional</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtio-non-transitional</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='backendModel'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>random</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>egd</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>builtin</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </rng>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <filesystem supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='driverType'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>path</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>handle</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>virtiofs</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </filesystem>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <tpm supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='model'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>tpm-tis</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>tpm-crb</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='backendModel'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>emulator</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>external</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='backendVersion'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>2.0</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </tpm>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <redirdev supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='bus'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>usb</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </redirdev>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <channel supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='type'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>pty</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>unix</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </channel>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <crypto supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='model'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='type'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>qemu</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='backendModel'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>builtin</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </crypto>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <interface supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='backendType'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>default</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>passt</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </interface>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <panic supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='model'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>isa</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>hyperv</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </panic>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <console supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='type'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>null</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>vc</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>pty</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>dev</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>file</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>pipe</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>stdio</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>udp</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>tcp</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>unix</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>qemu-vdagent</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>dbus</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </console>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   </devices>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   <features>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <gic supported='no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <vmcoreinfo supported='yes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <genid supported='yes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <backingStoreInput supported='yes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <backup supported='yes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <async-teardown supported='yes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <ps2 supported='yes'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <sev supported='no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <sgx supported='no'/>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <hyperv supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='features'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>relaxed</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>vapic</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>spinlocks</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>vpindex</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>runtime</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>synic</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>stimer</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>reset</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>vendor_id</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>frequencies</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>reenlightenment</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>tlbflush</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>ipi</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>avic</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>emsr_bitmap</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>xmm_input</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <defaults>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <spinlocks>4095</spinlocks>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <stimer_direct>on</stimer_direct>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <tlbflush_direct>on</tlbflush_direct>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <tlbflush_extended>on</tlbflush_extended>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </defaults>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </hyperv>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     <launchSecurity supported='yes'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       <enum name='sectype'>
Dec 01 10:09:46 compute-2 nova_compute[230216]:         <value>tdx</value>
Dec 01 10:09:46 compute-2 nova_compute[230216]:       </enum>
Dec 01 10:09:46 compute-2 nova_compute[230216]:     </launchSecurity>
Dec 01 10:09:46 compute-2 nova_compute[230216]:   </features>
Dec 01 10:09:46 compute-2 nova_compute[230216]: </domainCapabilities>
Dec 01 10:09:46 compute-2 nova_compute[230216]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 01 10:09:46 compute-2 nova_compute[230216]: 2025-12-01 10:09:46.227 230220 DEBUG nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 01 10:09:46 compute-2 nova_compute[230216]: 2025-12-01 10:09:46.227 230220 DEBUG nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 01 10:09:46 compute-2 nova_compute[230216]: 2025-12-01 10:09:46.228 230220 DEBUG nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 01 10:09:46 compute-2 nova_compute[230216]: 2025-12-01 10:09:46.228 230220 INFO nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Secure Boot support detected
Dec 01 10:09:46 compute-2 nova_compute[230216]: 2025-12-01 10:09:46.230 230220 DEBUG nova.virt.libvirt.volume.mount [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 01 10:09:46 compute-2 nova_compute[230216]: 2025-12-01 10:09:46.233 230220 INFO nova.virt.libvirt.driver [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 01 10:09:46 compute-2 nova_compute[230216]: 2025-12-01 10:09:46.233 230220 INFO nova.virt.libvirt.driver [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 01 10:09:46 compute-2 nova_compute[230216]: 2025-12-01 10:09:46.246 230220 DEBUG nova.virt.libvirt.driver [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Dec 01 10:09:46 compute-2 nova_compute[230216]: 2025-12-01 10:09:46.272 230220 INFO nova.virt.node [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Determined node identity 801130cb-2e08-4a6f-b53c-1300fad37b0c from /var/lib/nova/compute_id
Dec 01 10:09:46 compute-2 nova_compute[230216]: 2025-12-01 10:09:46.289 230220 WARNING nova.compute.manager [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Compute nodes ['801130cb-2e08-4a6f-b53c-1300fad37b0c'] for host compute-2.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Dec 01 10:09:46 compute-2 nova_compute[230216]: 2025-12-01 10:09:46.313 230220 INFO nova.compute.manager [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Dec 01 10:09:46 compute-2 nova_compute[230216]: 2025-12-01 10:09:46.338 230220 WARNING nova.compute.manager [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] No compute node record found for host compute-2.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.
Dec 01 10:09:46 compute-2 nova_compute[230216]: 2025-12-01 10:09:46.338 230220 DEBUG oslo_concurrency.lockutils [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:09:46 compute-2 nova_compute[230216]: 2025-12-01 10:09:46.339 230220 DEBUG oslo_concurrency.lockutils [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:09:46 compute-2 nova_compute[230216]: 2025-12-01 10:09:46.339 230220 DEBUG oslo_concurrency.lockutils [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:09:46 compute-2 nova_compute[230216]: 2025-12-01 10:09:46.339 230220 DEBUG nova.compute.resource_tracker [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 10:09:46 compute-2 nova_compute[230216]: 2025-12-01 10:09:46.340 230220 DEBUG oslo_concurrency.processutils [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:09:46 compute-2 rsyslogd[1007]: imjournal from <np0005540827:nova_compute>: begin to drop messages due to rate-limiting
Dec 01 10:09:46 compute-2 ceph-mon[76053]: pgmap v563: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:09:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:46 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7100016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:46 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:09:46 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:09:46 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1518927992' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:09:46 compute-2 nova_compute[230216]: 2025-12-01 10:09:46.798 230220 DEBUG oslo_concurrency.processutils [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:09:46 compute-2 nova_compute[230216]: 2025-12-01 10:09:46.993 230220 WARNING nova.virt.libvirt.driver [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 10:09:46 compute-2 nova_compute[230216]: 2025-12-01 10:09:46.994 230220 DEBUG nova.compute.resource_tracker [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5233MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 10:09:46 compute-2 nova_compute[230216]: 2025-12-01 10:09:46.994 230220 DEBUG oslo_concurrency.lockutils [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:09:46 compute-2 nova_compute[230216]: 2025-12-01 10:09:46.995 230220 DEBUG oslo_concurrency.lockutils [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:09:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:46 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6ec0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:47 compute-2 nova_compute[230216]: 2025-12-01 10:09:47.030 230220 WARNING nova.compute.resource_tracker [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] No compute node record for compute-2.ctlplane.example.com:801130cb-2e08-4a6f-b53c-1300fad37b0c: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 801130cb-2e08-4a6f-b53c-1300fad37b0c could not be found.
Dec 01 10:09:47 compute-2 nova_compute[230216]: 2025-12-01 10:09:47.072 230220 INFO nova.compute.resource_tracker [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Compute node record created for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com with uuid: 801130cb-2e08-4a6f-b53c-1300fad37b0c
Dec 01 10:09:47 compute-2 nova_compute[230216]: 2025-12-01 10:09:47.144 230220 DEBUG nova.compute.resource_tracker [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 10:09:47 compute-2 nova_compute[230216]: 2025-12-01 10:09:47.144 230220 DEBUG nova.compute.resource_tracker [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 10:09:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:47.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:47 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1518927992' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:09:47 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3980366206' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:09:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:09:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:47.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:09:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:48 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:48 compute-2 nova_compute[230216]: 2025-12-01 10:09:48.389 230220 INFO nova.scheduler.client.report [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] [req-e2b93a04-9294-47c4-b8c2-7e8d39a87a1f] Created resource provider record via placement API for resource provider with UUID 801130cb-2e08-4a6f-b53c-1300fad37b0c and name compute-2.ctlplane.example.com.
Dec 01 10:09:48 compute-2 nova_compute[230216]: 2025-12-01 10:09:48.488 230220 DEBUG oslo_concurrency.processutils [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:09:48 compute-2 ceph-mon[76053]: pgmap v564: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:09:48 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2407314549' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:09:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:48 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:48 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:09:48 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1168202625' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:09:48 compute-2 nova_compute[230216]: 2025-12-01 10:09:48.949 230220 DEBUG oslo_concurrency.processutils [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:09:48 compute-2 nova_compute[230216]: 2025-12-01 10:09:48.954 230220 DEBUG nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Dec 01 10:09:48 compute-2 nova_compute[230216]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Dec 01 10:09:48 compute-2 nova_compute[230216]: 2025-12-01 10:09:48.954 230220 INFO nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] kernel doesn't support AMD SEV
Dec 01 10:09:48 compute-2 nova_compute[230216]: 2025-12-01 10:09:48.955 230220 DEBUG nova.compute.provider_tree [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Updating inventory in ProviderTree for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 01 10:09:48 compute-2 nova_compute[230216]: 2025-12-01 10:09:48.956 230220 DEBUG nova.virt.libvirt.driver [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 01 10:09:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:49 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:49 compute-2 nova_compute[230216]: 2025-12-01 10:09:49.017 230220 DEBUG nova.scheduler.client.report [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Updated inventory for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Dec 01 10:09:49 compute-2 nova_compute[230216]: 2025-12-01 10:09:49.018 230220 DEBUG nova.compute.provider_tree [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Updating resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Dec 01 10:09:49 compute-2 nova_compute[230216]: 2025-12-01 10:09:49.018 230220 DEBUG nova.compute.provider_tree [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Updating inventory in ProviderTree for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 01 10:09:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:49 compute-2 nova_compute[230216]: 2025-12-01 10:09:49.227 230220 DEBUG nova.compute.provider_tree [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Updating resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Dec 01 10:09:49 compute-2 nova_compute[230216]: 2025-12-01 10:09:49.264 230220 DEBUG nova.compute.resource_tracker [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 10:09:49 compute-2 nova_compute[230216]: 2025-12-01 10:09:49.265 230220 DEBUG oslo_concurrency.lockutils [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.270s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:09:49 compute-2 nova_compute[230216]: 2025-12-01 10:09:49.265 230220 DEBUG nova.service [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Dec 01 10:09:49 compute-2 nova_compute[230216]: 2025-12-01 10:09:49.343 230220 DEBUG nova.service [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Dec 01 10:09:49 compute-2 nova_compute[230216]: 2025-12-01 10:09:49.344 230220 DEBUG nova.servicegroup.drivers.db [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] DB_Driver: join new ServiceGroup member compute-2.ctlplane.example.com to the compute group, service = <Service: host=compute-2.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Dec 01 10:09:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:49.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:49 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3727018525' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:09:49 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1168202625' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:09:49 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2987823263' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:09:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:09:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:49.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:09:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:50 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7100016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:50 compute-2 ceph-mon[76053]: pgmap v565: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:09:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:50 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:51 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:09:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:51.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:09:51 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:09:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:51.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:52 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:52 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7100016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:52 compute-2 ceph-mon[76053]: pgmap v566: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 01 10:09:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:53 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:53.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:53.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:54 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6ec001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:54 compute-2 podman[230596]: 2025-12-01 10:09:54.436925246 +0000 UTC m=+0.097386189 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 01 10:09:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:54 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:09:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:54 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:54 compute-2 ceph-mon[76053]: pgmap v567: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:09:54 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:09:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:55 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff710002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:55.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:55.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:56 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:56 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6ec001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:56 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:09:56 compute-2 ceph-mon[76053]: pgmap v568: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:09:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:57 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:57 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:09:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:57 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:09:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:57.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:57 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 10:09:57 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Cumulative writes: 5898 writes, 24K keys, 5898 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
                                           Cumulative WAL: 5898 writes, 1028 syncs, 5.74 writes per sync, written: 0.02 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 466 writes, 729 keys, 466 commit groups, 1.0 writes per commit group, ingest: 0.24 MB, 0.00 MB/s
                                           Interval WAL: 466 writes, 228 syncs, 2.04 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.020       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.020       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.02              0.00         1    0.020       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5636543369b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5636543369b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5636543369b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.006       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 01 10:09:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:57.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:09:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:58 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff710002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:58 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:58 compute-2 ceph-mon[76053]: pgmap v569: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Dec 01 10:09:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:59 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6ec002140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:09:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:09:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:09:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:59.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:09:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:09:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:09:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:59.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:10:00 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:10:00 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff710002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:10:00 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 10:10:00 compute-2 ceph-mon[76053]: pgmap v570: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Dec 01 10:10:00 compute-2 ceph-mon[76053]: overall HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore; 1 failed cephadm daemon(s)
Dec 01 10:10:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:10:01 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff710002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:01.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:01 compute-2 sudo[230631]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:10:01 compute-2 sudo[230631]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:10:01 compute-2 sudo[230631]: pam_unix(sudo:session): session closed for user root
Dec 01 10:10:01 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:10:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:01.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:10:02 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6ec0032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:10:02 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:10:03 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff710002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:03 compute-2 ceph-mon[76053]: pgmap v571: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 10:10:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:03.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:03.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:10:04 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f8003dd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:10:04 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6ec0032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:10:04.695 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:10:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:10:04.696 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:10:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:10:04.697 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:10:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:10:05 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:05 compute-2 ceph-mon[76053]: pgmap v572: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 10:10:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:05.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:10:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:05.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:10:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101006 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:10:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:10:06 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff710002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:06 compute-2 ceph-mon[76053]: pgmap v573: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 10:10:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:10:06 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f8003df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:06 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:10:07 compute-2 kernel: ganesha.nfsd[230252]: segfault at 50 ip 00007ff7cf17632e sp 00007ff793ffe210 error 4 in libntirpc.so.5.8[7ff7cf15b000+2c000] likely on CPU 2 (core 0, socket 2)
Dec 01 10:10:07 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec 01 10:10:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:10:07 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6ec0032f0 fd 39 proxy ignored for local
Dec 01 10:10:07 compute-2 systemd[1]: Started Process Core Dump (PID 230660/UID 0).
Dec 01 10:10:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:10:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:07.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:10:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:07.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:08 compute-2 ceph-mon[76053]: pgmap v574: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 10:10:08 compute-2 systemd-coredump[230661]: Process 224751 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 55:
                                                    #0  0x00007ff7cf17632e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Dec 01 10:10:08 compute-2 systemd[1]: systemd-coredump@13-230660-0.service: Deactivated successfully.
Dec 01 10:10:08 compute-2 systemd[1]: systemd-coredump@13-230660-0.service: Consumed 1.465s CPU time.
Dec 01 10:10:08 compute-2 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 01 10:10:08 compute-2 podman[230669]: 2025-12-01 10:10:08.667345679 +0000 UTC m=+0.028374055 container died 64936614362eba2d484ede66f4ae3d59fef36d8444e0a0b1a5be8f708ea55c64 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True)
Dec 01 10:10:08 compute-2 systemd[1]: var-lib-containers-storage-overlay-9353b63dd84f9a810916bf1d53b4086eb5eaf5c0e3a3a4222d09e997ac6f95eb-merged.mount: Deactivated successfully.
Dec 01 10:10:08 compute-2 podman[230669]: 2025-12-01 10:10:08.728143349 +0000 UTC m=+0.089171715 container remove 64936614362eba2d484ede66f4ae3d59fef36d8444e0a0b1a5be8f708ea55c64 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 10:10:08 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec 01 10:10:08 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec 01 10:10:08 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.622s CPU time.
Dec 01 10:10:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:10:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:09.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:10:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:10:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:09.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:10:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:10 compute-2 podman[230714]: 2025-12-01 10:10:10.391145689 +0000 UTC m=+0.052204997 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Dec 01 10:10:10 compute-2 ceph-mon[76053]: pgmap v575: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Dec 01 10:10:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:10:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:11.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:11 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:10:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:11.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:12 compute-2 ceph-mon[76053]: pgmap v576: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Dec 01 10:10:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101012 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:10:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101013 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:10:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:13.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:13.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:14 compute-2 ceph-mon[76053]: pgmap v577: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Dec 01 10:10:15 compute-2 sudo[230737]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:10:15 compute-2 sudo[230737]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:10:15 compute-2 sudo[230737]: pam_unix(sudo:session): session closed for user root
Dec 01 10:10:15 compute-2 sudo[230762]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:10:15 compute-2 sudo[230762]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:10:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:15 compute-2 podman[230786]: 2025-12-01 10:10:15.196456507 +0000 UTC m=+0.084093278 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 01 10:10:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:15.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:15 compute-2 sudo[230762]: pam_unix(sudo:session): session closed for user root
Dec 01 10:10:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:15.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:16 compute-2 rsyslogd[1007]: imjournal: 3284 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Dec 01 10:10:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:16 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:10:16 compute-2 ceph-mon[76053]: pgmap v578: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Dec 01 10:10:16 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:10:16 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:10:16 compute-2 ceph-mon[76053]: pgmap v579: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 193 B/s rd, 0 B/s wr, 0 op/s
Dec 01 10:10:16 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:10:16 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:10:16 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:10:16 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:10:16 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:10:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:17.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:17.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:17 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/1630024889' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:10:17 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/1630024889' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:10:17 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/85141022' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:10:17 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/85141022' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:10:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:18 compute-2 ceph-mon[76053]: pgmap v580: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 96 B/s rd, 0 op/s
Dec 01 10:10:18 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/3543658280' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:10:18 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/3543658280' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:10:19 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 14.
Dec 01 10:10:19 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 10:10:19 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.622s CPU time.
Dec 01 10:10:19 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec 01 10:10:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:19 compute-2 podman[230889]: 2025-12-01 10:10:19.288834976 +0000 UTC m=+0.024960061 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 10:10:19 compute-2 podman[230889]: 2025-12-01 10:10:19.554299467 +0000 UTC m=+0.290424522 container create 69eb2709463b25693c345e9990dd4a9bd129fc13486e60968e0c5d11c8ecd14e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 01 10:10:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:19.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:19 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/958e87dfa632670be0dcf526f7079bed57b0bae6e83de7d56917ee88c2b41f3d/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 01 10:10:19 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/958e87dfa632670be0dcf526f7079bed57b0bae6e83de7d56917ee88c2b41f3d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 10:10:19 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/958e87dfa632670be0dcf526f7079bed57b0bae6e83de7d56917ee88c2b41f3d/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 10:10:19 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/958e87dfa632670be0dcf526f7079bed57b0bae6e83de7d56917ee88c2b41f3d/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 10:10:19 compute-2 podman[230889]: 2025-12-01 10:10:19.609546257 +0000 UTC m=+0.345671332 container init 69eb2709463b25693c345e9990dd4a9bd129fc13486e60968e0c5d11c8ecd14e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True)
Dec 01 10:10:19 compute-2 podman[230889]: 2025-12-01 10:10:19.614931101 +0000 UTC m=+0.351056156 container start 69eb2709463b25693c345e9990dd4a9bd129fc13486e60968e0c5d11c8ecd14e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0)
Dec 01 10:10:19 compute-2 bash[230889]: 69eb2709463b25693c345e9990dd4a9bd129fc13486e60968e0c5d11c8ecd14e
Dec 01 10:10:19 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 10:10:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:19 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 01 10:10:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:19 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 01 10:10:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:19 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 01 10:10:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:19 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 01 10:10:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:19 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 01 10:10:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:19 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 01 10:10:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:19 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 01 10:10:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:19 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:10:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:19.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:20 compute-2 ceph-mon[76053]: pgmap v581: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 96 B/s rd, 0 op/s
Dec 01 10:10:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:21.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:21 compute-2 sudo[230949]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:10:21 compute-2 sudo[230949]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:10:21 compute-2 sudo[230949]: pam_unix(sudo:session): session closed for user root
Dec 01 10:10:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:10:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:21.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:10:21 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:10:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:23 compute-2 sudo[230976]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:10:23 compute-2 sudo[230976]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:10:23 compute-2 sudo[230976]: pam_unix(sudo:session): session closed for user root
Dec 01 10:10:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:10:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:23.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:10:23 compute-2 ceph-mon[76053]: pgmap v582: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 872 B/s rd, 193 B/s wr, 1 op/s
Dec 01 10:10:23 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:10:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:23.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:24 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:10:24 compute-2 ceph-mon[76053]: pgmap v583: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 873 B/s rd, 194 B/s wr, 1 op/s
Dec 01 10:10:24 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:10:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:25 compute-2 podman[231003]: 2025-12-01 10:10:25.431383998 +0000 UTC m=+0.092051746 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 10:10:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:25.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:25 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:10:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:25 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:10:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:25.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:26 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:10:26 compute-2 ceph-mon[76053]: pgmap v584: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 776 B/s wr, 3 op/s
Dec 01 10:10:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:10:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:27.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:10:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:10:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:27.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:10:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:29 compute-2 ceph-mon[76053]: pgmap v585: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 682 B/s wr, 2 op/s
Dec 01 10:10:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:29.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:29.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:31 compute-2 ceph-mon[76053]: pgmap v586: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 682 B/s wr, 2 op/s
Dec 01 10:10:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:10:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:31.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:10:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:31.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:31 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:10:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 10:10:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec 01 10:10:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec 01 10:10:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec 01 10:10:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec 01 10:10:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec 01 10:10:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec 01 10:10:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:10:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:10:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:10:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec 01 10:10:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:10:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec 01 10:10:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec 01 10:10:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec 01 10:10:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec 01 10:10:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec 01 10:10:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec 01 10:10:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec 01 10:10:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec 01 10:10:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec 01 10:10:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec 01 10:10:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec 01 10:10:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec 01 10:10:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 10:10:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec 01 10:10:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 10:10:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:32 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe18000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:32 compute-2 ceph-mon[76053]: pgmap v587: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.0 KiB/s rd, 1.1 KiB/s wr, 4 op/s
Dec 01 10:10:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:32 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:33 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf4000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:10:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:33.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:10:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:10:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:33.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:10:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:34 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe00000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:34 compute-2 sshd-session[231033]: Received disconnect from 45.78.219.119 port 42880:11: Bye Bye [preauth]
Dec 01 10:10:34 compute-2 sshd-session[231033]: Disconnected from authenticating user root 45.78.219.119 port 42880 [preauth]
Dec 01 10:10:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:34 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf8000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101034 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:10:34 compute-2 ceph-mon[76053]: pgmap v588: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 938 B/s wr, 3 op/s
Dec 01 10:10:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101035 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:10:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:35 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:10:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:35.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:10:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:35.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:36 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:36 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe00001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:36 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:10:36 compute-2 ceph-mon[76053]: pgmap v589: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 938 B/s wr, 3 op/s
Dec 01 10:10:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:37 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf8001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:37.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:10:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:37.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:10:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:38 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:38 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:39 compute-2 ceph-mon[76053]: pgmap v590: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 426 B/s wr, 2 op/s
Dec 01 10:10:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:39 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe00001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:39.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:10:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:39.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:10:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:10:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:40 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf8001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:40 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:41 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:41 compute-2 ceph-mon[76053]: pgmap v591: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 426 B/s wr, 2 op/s
Dec 01 10:10:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:41 compute-2 podman[231062]: 2025-12-01 10:10:41.424725029 +0000 UTC m=+0.066146703 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec 01 10:10:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:10:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:41.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:10:41 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:10:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:41.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:41 compute-2 sudo[231082]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:10:41 compute-2 sudo[231082]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:10:41 compute-2 sudo[231082]: pam_unix(sudo:session): session closed for user root
Dec 01 10:10:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:42 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe00001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:42 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf8001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:43 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:43 compute-2 ceph-mon[76053]: pgmap v592: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 426 B/s wr, 2 op/s
Dec 01 10:10:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:10:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:43.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:10:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:43.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:44 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf4002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:44 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe00002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:45 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf8002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:45 compute-2 ceph-mon[76053]: pgmap v593: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Dec 01 10:10:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:45 compute-2 podman[231111]: 2025-12-01 10:10:45.392486264 +0000 UTC m=+0.053607432 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 01 10:10:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:45.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:45.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:46 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:46 compute-2 nova_compute[230216]: 2025-12-01 10:10:46.346 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:10:46 compute-2 nova_compute[230216]: 2025-12-01 10:10:46.346 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:10:46 compute-2 nova_compute[230216]: 2025-12-01 10:10:46.347 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 10:10:46 compute-2 nova_compute[230216]: 2025-12-01 10:10:46.347 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 10:10:46 compute-2 nova_compute[230216]: 2025-12-01 10:10:46.409 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 10:10:46 compute-2 nova_compute[230216]: 2025-12-01 10:10:46.409 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:10:46 compute-2 nova_compute[230216]: 2025-12-01 10:10:46.409 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:10:46 compute-2 nova_compute[230216]: 2025-12-01 10:10:46.409 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:10:46 compute-2 nova_compute[230216]: 2025-12-01 10:10:46.410 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:10:46 compute-2 nova_compute[230216]: 2025-12-01 10:10:46.410 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:10:46 compute-2 nova_compute[230216]: 2025-12-01 10:10:46.410 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:10:46 compute-2 nova_compute[230216]: 2025-12-01 10:10:46.436 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:10:46 compute-2 nova_compute[230216]: 2025-12-01 10:10:46.436 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 10:10:46 compute-2 nova_compute[230216]: 2025-12-01 10:10:46.437 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:10:46 compute-2 nova_compute[230216]: 2025-12-01 10:10:46.467 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:10:46 compute-2 nova_compute[230216]: 2025-12-01 10:10:46.468 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:10:46 compute-2 nova_compute[230216]: 2025-12-01 10:10:46.468 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:10:46 compute-2 nova_compute[230216]: 2025-12-01 10:10:46.469 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 10:10:46 compute-2 nova_compute[230216]: 2025-12-01 10:10:46.470 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:10:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:46 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf4002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:46 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:10:46 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:10:46 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4046889273' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:10:46 compute-2 nova_compute[230216]: 2025-12-01 10:10:46.932 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:10:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:47 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf4002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:47 compute-2 nova_compute[230216]: 2025-12-01 10:10:47.090 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 10:10:47 compute-2 nova_compute[230216]: 2025-12-01 10:10:47.092 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5156MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 10:10:47 compute-2 nova_compute[230216]: 2025-12-01 10:10:47.092 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:10:47 compute-2 nova_compute[230216]: 2025-12-01 10:10:47.092 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:10:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:47 compute-2 ceph-mon[76053]: pgmap v594: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 B/s wr, 0 op/s
Dec 01 10:10:47 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3591721800' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:10:47 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3007495121' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:10:47 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/4046889273' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:10:47 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2882081013' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:10:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:47 compute-2 nova_compute[230216]: 2025-12-01 10:10:47.197 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 10:10:47 compute-2 nova_compute[230216]: 2025-12-01 10:10:47.198 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 10:10:47 compute-2 nova_compute[230216]: 2025-12-01 10:10:47.232 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:10:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:47.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:47 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:10:47 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3366574243' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:10:47 compute-2 nova_compute[230216]: 2025-12-01 10:10:47.665 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:10:47 compute-2 nova_compute[230216]: 2025-12-01 10:10:47.672 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 10:10:47 compute-2 nova_compute[230216]: 2025-12-01 10:10:47.702 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 10:10:47 compute-2 nova_compute[230216]: 2025-12-01 10:10:47.704 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 10:10:47 compute-2 nova_compute[230216]: 2025-12-01 10:10:47.704 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.612s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:10:47 compute-2 nova_compute[230216]: 2025-12-01 10:10:47.705 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:10:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:47.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:48 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf8002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:48 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3648943861' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:10:48 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3366574243' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:10:48 compute-2 ceph-mon[76053]: pgmap v595: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:10:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:48 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:49 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:49.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:49.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:50 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe00003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:50 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf8002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:51 compute-2 ceph-mon[76053]: pgmap v596: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:10:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:51 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:10:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:51.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:10:51 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:10:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:51.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:52 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:52 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe00003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:53 compute-2 ceph-mon[76053]: pgmap v597: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 10:10:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:53 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:53.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:53.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:54 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:54 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:55 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe00003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:55 compute-2 ceph-mon[76053]: pgmap v598: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:10:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:10:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:55.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:55.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:10:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:56 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:56 compute-2 podman[231186]: 2025-12-01 10:10:56.420925132 +0000 UTC m=+0.079024673 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 01 10:10:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:56 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:56 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:10:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:57 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:57 compute-2 ceph-mon[76053]: pgmap v599: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 01 10:10:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:57 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Dec 01 10:10:57 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Dec 01 10:10:57 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Dec 01 10:10:57 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Dec 01 10:10:57 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Dec 01 10:10:57 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Dec 01 10:10:57 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Dec 01 10:10:57 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Dec 01 10:10:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:10:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:57.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:10:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:10:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:57.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:10:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:58 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe00003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:58 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:59 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:10:59 compute-2 ceph-mon[76053]: pgmap v600: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:10:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:10:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:10:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:59.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:10:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:10:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:10:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:59.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:00 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:00 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe00003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:01 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:01 compute-2 ceph-mon[76053]: pgmap v601: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:11:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:01.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:01 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:11:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:01.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:01 compute-2 sudo[231218]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:11:01 compute-2 sudo[231218]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:11:01 compute-2 sudo[231218]: pam_unix(sudo:session): session closed for user root
Dec 01 10:11:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:02 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:02 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:03 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:03 compute-2 ceph-mon[76053]: pgmap v602: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 96 KiB/s rd, 0 B/s wr, 159 op/s
Dec 01 10:11:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:03.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:03.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:04 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:11:04.697 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:11:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:11:04.698 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:11:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:11:04.699 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:11:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:04 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:05 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbde4000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:05 compute-2 ceph-mon[76053]: pgmap v603: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 96 KiB/s rd, 0 B/s wr, 159 op/s
Dec 01 10:11:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:05.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:05.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:06 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:06 compute-2 ceph-mon[76053]: pgmap v604: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 96 KiB/s rd, 0 B/s wr, 159 op/s
Dec 01 10:11:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:06 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdec000d90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:06 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:11:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:07 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:11:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:07.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:11:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:07.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:08 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbde40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:08 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:08 compute-2 ceph-mon[76053]: pgmap v605: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 96 KiB/s rd, 0 B/s wr, 159 op/s
Dec 01 10:11:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:09 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdec0018b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:11:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:09.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:11:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:09.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:10 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:11:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:10 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbde40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:11 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:11 compute-2 ceph-mon[76053]: pgmap v606: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 96 KiB/s rd, 0 B/s wr, 159 op/s
Dec 01 10:11:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:11.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:11 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:11:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:11.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:12 compute-2 ceph-mon[76053]: pgmap v607: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 96 KiB/s rd, 0 B/s wr, 159 op/s
Dec 01 10:11:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:12 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdec0018b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:12 compute-2 podman[231255]: 2025-12-01 10:11:12.404521185 +0000 UTC m=+0.054800751 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 01 10:11:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:12 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:13 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbde40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:11:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:13.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:11:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:13.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:14 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:14 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdec0018b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:14 compute-2 ceph-mon[76053]: pgmap v608: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:11:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:15 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:15.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:11:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:15.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:11:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:16 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbde4002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:16 compute-2 podman[231280]: 2025-12-01 10:11:16.399172338 +0000 UTC m=+0.059486417 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 01 10:11:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:16 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:16 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:11:17 compute-2 kernel: ganesha.nfsd[231040]: segfault at 50 ip 00007fbec7acf32e sp 00007fbe91ffa210 error 4 in libntirpc.so.5.8[7fbec7ab4000+2c000] likely on CPU 1 (core 0, socket 1)
Dec 01 10:11:17 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec 01 10:11:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:17 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c003cc0 fd 38 proxy ignored for local
Dec 01 10:11:17 compute-2 systemd[1]: Started Process Core Dump (PID 231301/UID 0).
Dec 01 10:11:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:17 compute-2 ceph-mon[76053]: pgmap v609: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 01 10:11:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:11:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:17.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:11:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:17.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:18 compute-2 systemd-coredump[231302]: Process 230909 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 43:
                                                    #0  0x00007fbec7acf32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Dec 01 10:11:18 compute-2 systemd[1]: systemd-coredump@14-231301-0.service: Deactivated successfully.
Dec 01 10:11:18 compute-2 systemd[1]: systemd-coredump@14-231301-0.service: Consumed 1.437s CPU time.
Dec 01 10:11:18 compute-2 podman[231309]: 2025-12-01 10:11:18.624186881 +0000 UTC m=+0.027194456 container died 69eb2709463b25693c345e9990dd4a9bd129fc13486e60968e0c5d11c8ecd14e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 10:11:18 compute-2 systemd[1]: var-lib-containers-storage-overlay-958e87dfa632670be0dcf526f7079bed57b0bae6e83de7d56917ee88c2b41f3d-merged.mount: Deactivated successfully.
Dec 01 10:11:18 compute-2 podman[231309]: 2025-12-01 10:11:18.668338247 +0000 UTC m=+0.071345792 container remove 69eb2709463b25693c345e9990dd4a9bd129fc13486e60968e0c5d11c8ecd14e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.build-date=20250325, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 01 10:11:18 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec 01 10:11:18 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec 01 10:11:18 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.705s CPU time.
Dec 01 10:11:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:19 compute-2 ceph-mon[76053]: pgmap v610: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:11:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:19.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:19.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:20 compute-2 ceph-mon[76053]: pgmap v611: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:11:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:11:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:21.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:11:21 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/2861615114' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Dec 01 10:11:21 compute-2 ceph-mon[76053]: from='client.14964 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Dec 01 10:11:21 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/3098294709' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Dec 01 10:11:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:21.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:22 compute-2 sudo[231357]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:11:22 compute-2 sudo[231357]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:11:22 compute-2 sudo[231357]: pam_unix(sudo:session): session closed for user root
Dec 01 10:11:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:22 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:11:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101123 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:11:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:23.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:23 compute-2 sudo[231384]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:11:23 compute-2 sudo[231384]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:11:23 compute-2 sudo[231384]: pam_unix(sudo:session): session closed for user root
Dec 01 10:11:23 compute-2 sudo[231409]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:11:23 compute-2 sudo[231409]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:11:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:23.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:24 compute-2 sudo[231409]: pam_unix(sudo:session): session closed for user root
Dec 01 10:11:24 compute-2 ceph-mon[76053]: from='client.24515 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Dec 01 10:11:24 compute-2 ceph-mon[76053]: from='client.24515 -' entity='client.openstack' cmd=[{"prefix": "nfs cluster info", "cluster_id": "cephfs", "format": "json"}]: dispatch
Dec 01 10:11:24 compute-2 ceph-mon[76053]: pgmap v612: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 10:11:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:25.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:25.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:25 compute-2 ceph-mon[76053]: pgmap v613: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:11:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:11:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:11:25 compute-2 ceph-mon[76053]: pgmap v614: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 292 B/s rd, 0 op/s
Dec 01 10:11:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:11:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:11:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:11:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:11:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:11:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:11:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:27 compute-2 podman[231469]: 2025-12-01 10:11:27.467543337 +0000 UTC m=+0.119005185 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller)
Dec 01 10:11:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:27.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:27 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:11:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:27.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:27 compute-2 ceph-mon[76053]: pgmap v615: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 194 B/s rd, 0 op/s
Dec 01 10:11:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101128 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:11:29 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 15.
Dec 01 10:11:29 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 10:11:29 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.705s CPU time.
Dec 01 10:11:29 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec 01 10:11:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:29 compute-2 podman[231546]: 2025-12-01 10:11:29.290345896 +0000 UTC m=+0.046639389 container create 46cafbc3becf0ae20abeaf7a2f08145d95d92c8e67d6c330575709764adc6d27 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS)
Dec 01 10:11:29 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b98af25484cbd1b782037de2a96d50a113a92addaf5ae3d406c373dfc3f368f1/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 01 10:11:29 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b98af25484cbd1b782037de2a96d50a113a92addaf5ae3d406c373dfc3f368f1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 10:11:29 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b98af25484cbd1b782037de2a96d50a113a92addaf5ae3d406c373dfc3f368f1/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 10:11:29 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b98af25484cbd1b782037de2a96d50a113a92addaf5ae3d406c373dfc3f368f1/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 10:11:29 compute-2 podman[231546]: 2025-12-01 10:11:29.359144673 +0000 UTC m=+0.115438196 container init 46cafbc3becf0ae20abeaf7a2f08145d95d92c8e67d6c330575709764adc6d27 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 10:11:29 compute-2 podman[231546]: 2025-12-01 10:11:29.266627417 +0000 UTC m=+0.022920940 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 10:11:29 compute-2 podman[231546]: 2025-12-01 10:11:29.364670741 +0000 UTC m=+0.120964234 container start 46cafbc3becf0ae20abeaf7a2f08145d95d92c8e67d6c330575709764adc6d27 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 10:11:29 compute-2 bash[231546]: 46cafbc3becf0ae20abeaf7a2f08145d95d92c8e67d6c330575709764adc6d27
Dec 01 10:11:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:29 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 01 10:11:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:29 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 01 10:11:29 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 10:11:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:29 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 01 10:11:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:29 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 01 10:11:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:29 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 01 10:11:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:29 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 01 10:11:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:29 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 01 10:11:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:29 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:11:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:11:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:29.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:11:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:11:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:29.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:11:29 compute-2 ceph-mon[76053]: pgmap v616: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 194 B/s rd, 0 op/s
Dec 01 10:11:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:31 compute-2 sudo[231605]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:11:31 compute-2 sudo[231605]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:11:31 compute-2 sudo[231605]: pam_unix(sudo:session): session closed for user root
Dec 01 10:11:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:11:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:31.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:11:31 compute-2 ceph-mon[76053]: pgmap v617: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 194 B/s rd, 0 op/s
Dec 01 10:11:31 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:11:31 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:11:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:11:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:31.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:11:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:32 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:11:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:11:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:33.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:11:33 compute-2 ceph-mon[76053]: pgmap v618: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 584 B/s rd, 97 B/s wr, 0 op/s
Dec 01 10:11:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:33.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:35 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:11:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:35 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:11:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:35 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 01 10:11:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:35.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:35 compute-2 ceph-mon[76053]: pgmap v619: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 584 B/s rd, 97 B/s wr, 0 op/s
Dec 01 10:11:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:35 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:11:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:35 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:11:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:35 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:11:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:35.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:37.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:37 compute-2 ceph-mon[76053]: pgmap v620: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 426 B/s wr, 1 op/s
Dec 01 10:11:37 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:11:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:37.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:39.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:39 compute-2 ceph-mon[76053]: pgmap v621: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Dec 01 10:11:39 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:11:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:39.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:41.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 10:11:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec 01 10:11:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec 01 10:11:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec 01 10:11:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec 01 10:11:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec 01 10:11:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec 01 10:11:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:11:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:11:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:11:41 compute-2 ceph-mon[76053]: pgmap v622: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Dec 01 10:11:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec 01 10:11:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:11:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec 01 10:11:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec 01 10:11:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec 01 10:11:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec 01 10:11:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec 01 10:11:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec 01 10:11:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec 01 10:11:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec 01 10:11:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec 01 10:11:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec 01 10:11:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec 01 10:11:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec 01 10:11:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 10:11:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec 01 10:11:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 10:11:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:41.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:11:42 compute-2 sudo[231654]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:11:42 compute-2 sudo[231654]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:11:42 compute-2 sudo[231654]: pam_unix(sudo:session): session closed for user root
Dec 01 10:11:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:42 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0094000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:42 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:42 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:11:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:43 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0070000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:43 compute-2 podman[231685]: 2025-12-01 10:11:43.397343266 +0000 UTC m=+0.056187677 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 01 10:11:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:43.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:43.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:43 compute-2 ceph-mon[76053]: pgmap v623: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.6 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 10:11:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:44 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:44 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "version", "format": "json"} v 0)
Dec 01 10:11:44 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2895286954' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Dec 01 10:11:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:44 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:44 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:11:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:44 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:11:44 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/1575990772' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Dec 01 10:11:44 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/2895286954' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Dec 01 10:11:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101145 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:11:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:45 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:45 compute-2 nova_compute[230216]: 2025-12-01 10:11:45.561 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:11:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:11:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:45.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:11:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:45.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:45 compute-2 ceph-mon[76053]: from='client.24533 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Dec 01 10:11:45 compute-2 ceph-mon[76053]: from='client.24533 -' entity='client.openstack' cmd=[{"prefix": "nfs cluster info", "cluster_id": "cephfs", "format": "json"}]: dispatch
Dec 01 10:11:45 compute-2 ceph-mon[76053]: from='client.24670 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Dec 01 10:11:45 compute-2 ceph-mon[76053]: pgmap v624: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Dec 01 10:11:45 compute-2 nova_compute[230216]: 2025-12-01 10:11:45.896 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:11:45 compute-2 nova_compute[230216]: 2025-12-01 10:11:45.897 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 10:11:45 compute-2 nova_compute[230216]: 2025-12-01 10:11:45.897 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 10:11:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:46 compute-2 nova_compute[230216]: 2025-12-01 10:11:46.203 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 10:11:46 compute-2 nova_compute[230216]: 2025-12-01 10:11:46.204 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:11:46 compute-2 nova_compute[230216]: 2025-12-01 10:11:46.205 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:11:46 compute-2 nova_compute[230216]: 2025-12-01 10:11:46.205 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:11:46 compute-2 nova_compute[230216]: 2025-12-01 10:11:46.206 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 10:11:46 compute-2 nova_compute[230216]: 2025-12-01 10:11:46.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:11:46 compute-2 nova_compute[230216]: 2025-12-01 10:11:46.241 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:11:46 compute-2 nova_compute[230216]: 2025-12-01 10:11:46.241 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:11:46 compute-2 nova_compute[230216]: 2025-12-01 10:11:46.241 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:11:46 compute-2 nova_compute[230216]: 2025-12-01 10:11:46.242 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 10:11:46 compute-2 nova_compute[230216]: 2025-12-01 10:11:46.242 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:11:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:46 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f00700016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:46 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:11:46 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/285200310' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:11:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:46 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:46 compute-2 nova_compute[230216]: 2025-12-01 10:11:46.752 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:11:46 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/285200310' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:11:46 compute-2 nova_compute[230216]: 2025-12-01 10:11:46.905 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 10:11:46 compute-2 nova_compute[230216]: 2025-12-01 10:11:46.906 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5156MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 10:11:46 compute-2 nova_compute[230216]: 2025-12-01 10:11:46.907 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:11:46 compute-2 nova_compute[230216]: 2025-12-01 10:11:46.907 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:11:46 compute-2 nova_compute[230216]: 2025-12-01 10:11:46.986 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 10:11:46 compute-2 nova_compute[230216]: 2025-12-01 10:11:46.987 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 10:11:47 compute-2 nova_compute[230216]: 2025-12-01 10:11:47.008 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:11:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:47 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:47 compute-2 podman[231750]: 2025-12-01 10:11:47.401492204 +0000 UTC m=+0.060431101 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd)
Dec 01 10:11:47 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:11:47 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3554242871' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:11:47 compute-2 nova_compute[230216]: 2025-12-01 10:11:47.479 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:11:47 compute-2 nova_compute[230216]: 2025-12-01 10:11:47.485 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 10:11:47 compute-2 nova_compute[230216]: 2025-12-01 10:11:47.509 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 10:11:47 compute-2 nova_compute[230216]: 2025-12-01 10:11:47.511 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 10:11:47 compute-2 nova_compute[230216]: 2025-12-01 10:11:47.511 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.604s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:11:47 compute-2 nova_compute[230216]: 2025-12-01 10:11:47.513 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:11:47 compute-2 nova_compute[230216]: 2025-12-01 10:11:47.514 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:11:47 compute-2 nova_compute[230216]: 2025-12-01 10:11:47.514 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:11:47 compute-2 nova_compute[230216]: 2025-12-01 10:11:47.515 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:11:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:47.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:47 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:11:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:11:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:47.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:11:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:47 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 10:11:47 compute-2 ceph-mon[76053]: pgmap v625: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 4.2 KiB/s rd, 1.7 KiB/s wr, 6 op/s
Dec 01 10:11:47 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3554242871' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:11:47 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2014530076' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:11:47 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1106534863' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:11:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:48 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:48 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f00700016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:48 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/977174623' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:11:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:49 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:11:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:49.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:11:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:49.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:49 compute-2 ceph-mon[76053]: pgmap v626: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 3.2 KiB/s rd, 1.3 KiB/s wr, 4 op/s
Dec 01 10:11:49 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/649293739' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:11:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:50 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:50 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101150 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:11:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:51 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f00700016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:11:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:51.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:11:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:11:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:51.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:11:51 compute-2 ceph-mon[76053]: pgmap v627: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 3.2 KiB/s rd, 1.3 KiB/s wr, 4 op/s
Dec 01 10:11:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:52 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:52 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:52 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:11:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:53 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:53.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:53.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:53 compute-2 ceph-mon[76053]: pgmap v628: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 3.3 KiB/s rd, 1.3 KiB/s wr, 5 op/s
Dec 01 10:11:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:54 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0070002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:54 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:54 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:11:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:55 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:11:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:55.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:11:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:55.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:56 compute-2 ceph-mon[76053]: pgmap v629: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 767 B/s wr, 3 op/s
Dec 01 10:11:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:56 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:56 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0070002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:57 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:57.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:57 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:11:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:57.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:58 compute-2 ceph-mon[76053]: pgmap v630: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 767 B/s wr, 3 op/s
Dec 01 10:11:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:58 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:58 compute-2 podman[231782]: 2025-12-01 10:11:58.425855044 +0000 UTC m=+0.083418381 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 01 10:11:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:58 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:59 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0070002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:11:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:11:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:59.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:11:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:11:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:11:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:59.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:00 compute-2 ceph-mon[76053]: pgmap v631: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Dec 01 10:12:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:00 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:00 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:01 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:01.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:01.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:02 compute-2 ceph-mon[76053]: pgmap v632: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Dec 01 10:12:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:02 compute-2 sudo[231812]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:12:02 compute-2 sudo[231812]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:12:02 compute-2 sudo[231812]: pam_unix(sudo:session): session closed for user root
Dec 01 10:12:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:02 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0070003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:02 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:02 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:12:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:03 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:03.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:03.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:04 compute-2 ceph-mon[76053]: pgmap v633: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Dec 01 10:12:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:04 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:12:04.698 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:12:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:12:04.698 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:12:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:12:04.698 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:12:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:04 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0070003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:05 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:05 compute-2 ceph-mon[76053]: pgmap v634: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:12:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:05.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:05.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:06 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:06 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 01 10:12:07 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2169502467' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:12:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 01 10:12:07 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2169502467' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:12:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:07 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0070003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:07 compute-2 ceph-mon[76053]: pgmap v635: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:12:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/2169502467' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:12:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/2169502467' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:12:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:12:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:07.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:12:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:12:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:07.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:08 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:08 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:09 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:09 compute-2 ceph-mon[76053]: pgmap v636: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:12:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:12:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:09.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:12:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:12:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:09.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:12:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:10 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0070003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:12:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:10 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:11 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:11 compute-2 ceph-mon[76053]: pgmap v637: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:12:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:11.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:11.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:12 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:12 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0070003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:12 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:12:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:13 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:13 compute-2 ceph-mon[76053]: pgmap v638: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:12:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:13.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:13.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:14 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:14 compute-2 podman[231851]: 2025-12-01 10:12:14.394477035 +0000 UTC m=+0.053480071 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 01 10:12:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:14 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:15 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:15 compute-2 ceph-mon[76053]: pgmap v639: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:12:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:15.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:15.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:16 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:16 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:17 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:17 compute-2 ceph-mon[76053]: pgmap v640: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:12:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:12:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:17.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:12:17 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:12:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:12:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:17.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:12:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:18 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0094000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:18 compute-2 podman[231874]: 2025-12-01 10:12:18.397445001 +0000 UTC m=+0.058475240 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 01 10:12:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:18 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:19 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:19 compute-2 ceph-mon[76053]: pgmap v641: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:12:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:19.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:19.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:20 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:20 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f00940020a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:21 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:21 compute-2 ceph-mon[76053]: pgmap v642: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:12:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:21.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:12:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:21.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:12:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:22 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:22 compute-2 sudo[231898]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:12:22 compute-2 sudo[231898]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:12:22 compute-2 sudo[231898]: pam_unix(sudo:session): session closed for user root
Dec 01 10:12:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:22 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:22 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:12:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:23 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f00940020a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:23 compute-2 ceph-mon[76053]: pgmap v643: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:12:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:23.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:23.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:24 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:24 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:12:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:24 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:25 compute-2 kernel: ganesha.nfsd[231647]: segfault at 50 ip 00007f014441332e sp 00007f010d7f9210 error 4 in libntirpc.so.5.8[7f01443f8000+2c000] likely on CPU 0 (core 0, socket 0)
Dec 01 10:12:25 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec 01 10:12:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:25 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078004050 fd 38 proxy ignored for local
Dec 01 10:12:25 compute-2 systemd[1]: Started Process Core Dump (PID 231925/UID 0).
Dec 01 10:12:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:25 compute-2 ceph-mon[76053]: pgmap v644: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:12:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:25.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:12:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:25.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:12:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:26 compute-2 systemd-coredump[231926]: Process 231567 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 45:
                                                    #0  0x00007f014441332e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Dec 01 10:12:26 compute-2 systemd[1]: systemd-coredump@15-231925-0.service: Deactivated successfully.
Dec 01 10:12:26 compute-2 systemd[1]: systemd-coredump@15-231925-0.service: Consumed 1.400s CPU time.
Dec 01 10:12:26 compute-2 podman[231933]: 2025-12-01 10:12:26.616268789 +0000 UTC m=+0.027105875 container died 46cafbc3becf0ae20abeaf7a2f08145d95d92c8e67d6c330575709764adc6d27 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 01 10:12:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:26 compute-2 systemd[1]: var-lib-containers-storage-overlay-b98af25484cbd1b782037de2a96d50a113a92addaf5ae3d406c373dfc3f368f1-merged.mount: Deactivated successfully.
Dec 01 10:12:26 compute-2 podman[231933]: 2025-12-01 10:12:26.649005747 +0000 UTC m=+0.059842813 container remove 46cafbc3becf0ae20abeaf7a2f08145d95d92c8e67d6c330575709764adc6d27 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 10:12:26 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec 01 10:12:26 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec 01 10:12:26 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.650s CPU time.
Dec 01 10:12:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:27 compute-2 ceph-mon[76053]: pgmap v645: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:12:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:27.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:27 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:12:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:27.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:29 compute-2 podman[231981]: 2025-12-01 10:12:29.427722354 +0000 UTC m=+0.082928250 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 01 10:12:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:29.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:29 compute-2 ceph-mon[76053]: pgmap v646: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:12:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:12:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:29.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:12:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101231 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:12:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:31 compute-2 sudo[232008]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:12:31 compute-2 sudo[232008]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:12:31 compute-2 sudo[232008]: pam_unix(sudo:session): session closed for user root
Dec 01 10:12:31 compute-2 sudo[232033]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:12:31 compute-2 sudo[232033]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:12:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:31.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:31 compute-2 sudo[232033]: pam_unix(sudo:session): session closed for user root
Dec 01 10:12:31 compute-2 ceph-mon[76053]: pgmap v647: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:12:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:31.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:32 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:12:32 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:12:32 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:12:32 compute-2 ceph-mon[76053]: pgmap v648: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 440 B/s rd, 0 op/s
Dec 01 10:12:32 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:12:32 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:12:32 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:12:32 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:12:32 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:12:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:33.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:12:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:33.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:12:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101235 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:12:35 compute-2 ceph-mon[76053]: pgmap v649: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 175 B/s rd, 0 op/s
Dec 01 10:12:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:35 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:12:35.668 141949 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 01 10:12:35 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:12:35.669 141949 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 01 10:12:35 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:12:35.671 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 01 10:12:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:12:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:35.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:12:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:35.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:36 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 16.
Dec 01 10:12:36 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 10:12:36 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.650s CPU time.
Dec 01 10:12:36 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec 01 10:12:37 compute-2 podman[232139]: 2025-12-01 10:12:37.055902622 +0000 UTC m=+0.045369679 container create cfee4a9261ebcc5bb17128b9d9e03ba4db3166b0d880045f27b13fe62969b81b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 10:12:37 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/896c2ed1ba6929fb5d39c5fb1e86093b0fd45c6727d0b825ed51b1d3eded0228/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 01 10:12:37 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/896c2ed1ba6929fb5d39c5fb1e86093b0fd45c6727d0b825ed51b1d3eded0228/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 10:12:37 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/896c2ed1ba6929fb5d39c5fb1e86093b0fd45c6727d0b825ed51b1d3eded0228/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 10:12:37 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/896c2ed1ba6929fb5d39c5fb1e86093b0fd45c6727d0b825ed51b1d3eded0228/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 10:12:37 compute-2 podman[232139]: 2025-12-01 10:12:37.118914168 +0000 UTC m=+0.108381255 container init cfee4a9261ebcc5bb17128b9d9e03ba4db3166b0d880045f27b13fe62969b81b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec 01 10:12:37 compute-2 podman[232139]: 2025-12-01 10:12:37.128022785 +0000 UTC m=+0.117489842 container start cfee4a9261ebcc5bb17128b9d9e03ba4db3166b0d880045f27b13fe62969b81b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec 01 10:12:37 compute-2 podman[232139]: 2025-12-01 10:12:37.037208908 +0000 UTC m=+0.026675995 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 10:12:37 compute-2 bash[232139]: cfee4a9261ebcc5bb17128b9d9e03ba4db3166b0d880045f27b13fe62969b81b
Dec 01 10:12:37 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 10:12:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:37 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 01 10:12:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:37 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 01 10:12:37 compute-2 ceph-mon[76053]: pgmap v650: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 175 B/s rd, 0 op/s
Dec 01 10:12:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:37 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 01 10:12:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:37 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 01 10:12:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:37 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 01 10:12:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:37 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 01 10:12:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:37 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 01 10:12:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:37 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:12:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:37 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:12:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:37.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:37.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:38 compute-2 sudo[232199]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:12:38 compute-2 sudo[232199]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:12:38 compute-2 sudo[232199]: pam_unix(sudo:session): session closed for user root
Dec 01 10:12:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:39 compute-2 ceph-mon[76053]: pgmap v651: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 175 B/s rd, 0 op/s
Dec 01 10:12:39 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:12:39 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:12:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:12:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:39.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:12:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:39.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:12:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:41 compute-2 ceph-mon[76053]: pgmap v652: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 176 B/s rd, 0 op/s
Dec 01 10:12:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:41.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.002000046s ======
Dec 01 10:12:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:41.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000046s
Dec 01 10:12:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:42 compute-2 sudo[232229]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:12:42 compute-2 sudo[232229]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:12:42 compute-2 sudo[232229]: pam_unix(sudo:session): session closed for user root
Dec 01 10:12:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:42 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:12:43 compute-2 ceph-mon[76053]: pgmap v653: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 968 B/s rd, 440 B/s wr, 1 op/s
Dec 01 10:12:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:43 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:12:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:43 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:12:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:12:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:43.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:12:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:43.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:44 compute-2 nova_compute[230216]: 2025-12-01 10:12:44.209 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:12:44 compute-2 ceph-mon[76053]: pgmap v654: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 426 B/s wr, 1 op/s
Dec 01 10:12:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:45 compute-2 nova_compute[230216]: 2025-12-01 10:12:45.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:12:45 compute-2 nova_compute[230216]: 2025-12-01 10:12:45.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 10:12:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:45 compute-2 podman[232258]: 2025-12-01 10:12:45.406159939 +0000 UTC m=+0.063754315 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Dec 01 10:12:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:45.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:45.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:46 compute-2 nova_compute[230216]: 2025-12-01 10:12:46.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:12:46 compute-2 nova_compute[230216]: 2025-12-01 10:12:46.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 10:12:46 compute-2 nova_compute[230216]: 2025-12-01 10:12:46.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 10:12:46 compute-2 nova_compute[230216]: 2025-12-01 10:12:46.223 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 10:12:46 compute-2 nova_compute[230216]: 2025-12-01 10:12:46.223 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:12:46 compute-2 nova_compute[230216]: 2025-12-01 10:12:46.223 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:12:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:47 compute-2 ceph-mon[76053]: pgmap v655: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Dec 01 10:12:47 compute-2 nova_compute[230216]: 2025-12-01 10:12:47.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:12:47 compute-2 nova_compute[230216]: 2025-12-01 10:12:47.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:12:47 compute-2 nova_compute[230216]: 2025-12-01 10:12:47.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:12:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:47 compute-2 nova_compute[230216]: 2025-12-01 10:12:47.227 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:12:47 compute-2 nova_compute[230216]: 2025-12-01 10:12:47.227 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:12:47 compute-2 nova_compute[230216]: 2025-12-01 10:12:47.227 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:12:47 compute-2 nova_compute[230216]: 2025-12-01 10:12:47.228 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 10:12:47 compute-2 nova_compute[230216]: 2025-12-01 10:12:47.228 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:12:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:47 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:12:47 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3283572621' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:12:47 compute-2 nova_compute[230216]: 2025-12-01 10:12:47.683 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:12:47 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:12:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:12:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:47.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:12:47 compute-2 nova_compute[230216]: 2025-12-01 10:12:47.848 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 10:12:47 compute-2 nova_compute[230216]: 2025-12-01 10:12:47.849 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5239MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 10:12:47 compute-2 nova_compute[230216]: 2025-12-01 10:12:47.849 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:12:47 compute-2 nova_compute[230216]: 2025-12-01 10:12:47.850 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:12:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:47.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:47 compute-2 nova_compute[230216]: 2025-12-01 10:12:47.916 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 10:12:47 compute-2 nova_compute[230216]: 2025-12-01 10:12:47.917 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 10:12:47 compute-2 nova_compute[230216]: 2025-12-01 10:12:47.931 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:12:48 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1953192487' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:12:48 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3283572621' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:12:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:48 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:12:48 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2611858527' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:12:48 compute-2 nova_compute[230216]: 2025-12-01 10:12:48.369 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:12:48 compute-2 nova_compute[230216]: 2025-12-01 10:12:48.374 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 10:12:48 compute-2 nova_compute[230216]: 2025-12-01 10:12:48.391 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 10:12:48 compute-2 nova_compute[230216]: 2025-12-01 10:12:48.392 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 10:12:48 compute-2 nova_compute[230216]: 2025-12-01 10:12:48.393 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.543s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:12:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:49 compute-2 ceph-mon[76053]: pgmap v656: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Dec 01 10:12:49 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3555826676' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:12:49 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2611858527' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:12:49 compute-2 nova_compute[230216]: 2025-12-01 10:12:49.387 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:12:49 compute-2 podman[232326]: 2025-12-01 10:12:49.40720489 +0000 UTC m=+0.057412835 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec 01 10:12:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 10:12:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec 01 10:12:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec 01 10:12:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec 01 10:12:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec 01 10:12:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec 01 10:12:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec 01 10:12:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:12:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:12:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:12:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec 01 10:12:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:12:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec 01 10:12:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec 01 10:12:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec 01 10:12:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec 01 10:12:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec 01 10:12:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec 01 10:12:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec 01 10:12:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec 01 10:12:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec 01 10:12:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec 01 10:12:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec 01 10:12:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec 01 10:12:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 10:12:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec 01 10:12:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 10:12:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:12:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:12:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:49.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:12:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:12:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:49.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:12:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:50 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1068358381' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:12:50 compute-2 ceph-mon[76053]: pgmap v657: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Dec 01 10:12:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:50 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4468000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:50 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4460001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:51 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:51 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2824092753' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:12:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:51.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:51.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:52 compute-2 ceph-mon[76053]: pgmap v658: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 3.1 KiB/s rd, 1.5 KiB/s wr, 4 op/s
Dec 01 10:12:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:52 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4438000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:52 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:12:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:52 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:12:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:52 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:12:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:52 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101253 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:12:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:53 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:53.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:53.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.179194) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583974179299, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 2368, "num_deletes": 251, "total_data_size": 6458985, "memory_usage": 6542336, "flush_reason": "Manual Compaction"}
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583974222502, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 4178025, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20813, "largest_seqno": 23176, "table_properties": {"data_size": 4168505, "index_size": 6014, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19487, "raw_average_key_size": 20, "raw_value_size": 4149481, "raw_average_value_size": 4286, "num_data_blocks": 265, "num_entries": 968, "num_filter_entries": 968, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764583754, "oldest_key_time": 1764583754, "file_creation_time": 1764583974, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 43420 microseconds, and 8679 cpu microseconds.
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.222577) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 4178025 bytes OK
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.222643) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.224418) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.224439) EVENT_LOG_v1 {"time_micros": 1764583974224432, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.224465) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 6448578, prev total WAL file size 6448578, number of live WAL files 2.
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:12:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.226258) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031353036' seq:72057594037927935, type:22 .. '7061786F730031373538' seq:0, type:0; will stop at (end)
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(4080KB)], [39(12MB)]
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583974226342, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 17789030, "oldest_snapshot_seqno": -1}
Dec 01 10:12:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:54 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 5484 keys, 15620022 bytes, temperature: kUnknown
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583974525028, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 15620022, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15580618, "index_size": 24574, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13765, "raw_key_size": 138159, "raw_average_key_size": 25, "raw_value_size": 15478473, "raw_average_value_size": 2822, "num_data_blocks": 1016, "num_entries": 5484, "num_filter_entries": 5484, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764583974, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.525291) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 15620022 bytes
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.530793) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 59.5 rd, 52.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 13.0 +0.0 blob) out(14.9 +0.0 blob), read-write-amplify(8.0) write-amplify(3.7) OK, records in: 6000, records dropped: 516 output_compression: NoCompression
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.530829) EVENT_LOG_v1 {"time_micros": 1764583974530815, "job": 22, "event": "compaction_finished", "compaction_time_micros": 298773, "compaction_time_cpu_micros": 36477, "output_level": 6, "num_output_files": 1, "total_output_size": 15620022, "num_input_records": 6000, "num_output_records": 5484, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583974531617, "job": 22, "event": "table_file_deletion", "file_number": 41}
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583974535032, "job": 22, "event": "table_file_deletion", "file_number": 39}
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.226128) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.535197) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.535205) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.535207) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.535210) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:12:54 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.535212) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:12:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:54 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44380016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:55 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:55 compute-2 ceph-mon[76053]: pgmap v659: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1.1 KiB/s wr, 3 op/s
Dec 01 10:12:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:12:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:55 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 10:12:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:55.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:12:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:55.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:12:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:56 compute-2 ceph-mon[76053]: pgmap v660: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 3.2 KiB/s rd, 1.3 KiB/s wr, 4 op/s
Dec 01 10:12:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:56 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:56 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:57 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44380016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:57 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:12:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:57.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:57.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:58 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:58 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101259 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:12:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:59 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:12:59 compute-2 ceph-mon[76053]: pgmap v661: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 852 B/s wr, 3 op/s
Dec 01 10:12:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:12:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:59.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:12:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:12:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:12:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:59.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:00 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44380016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:00 compute-2 podman[232372]: 2025-12-01 10:13:00.430474445 +0000 UTC m=+0.087521100 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 01 10:13:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:00 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:01 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:01 compute-2 ceph-mon[76053]: pgmap v662: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 852 B/s wr, 3 op/s
Dec 01 10:13:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:01.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:01.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:02 compute-2 ceph-mon[76053]: pgmap v663: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 852 B/s wr, 3 op/s
Dec 01 10:13:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:02 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:02 compute-2 sudo[232401]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:13:02 compute-2 sudo[232401]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:13:02 compute-2 sudo[232401]: pam_unix(sudo:session): session closed for user root
Dec 01 10:13:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:02 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:13:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:02 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4438002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:03 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:03.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:03.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:04 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:13:04.699 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:13:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:13:04.699 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:13:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:13:04.699 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:13:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:04 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101305 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:13:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:05 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4438002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:05 compute-2 ceph-mon[76053]: pgmap v664: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 255 B/s wr, 1 op/s
Dec 01 10:13:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:05.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:05.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:06 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:06 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:07 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:07 compute-2 ceph-mon[76053]: pgmap v665: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 255 B/s wr, 1 op/s
Dec 01 10:13:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/3483028568' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:13:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/3483028568' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:13:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:13:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:07.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:13:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:07.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:13:08 compute-2 ceph-mon[76053]: pgmap v666: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:13:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:08 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4438002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:08 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:09 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:09 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:13:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:09.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:09.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:10 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:10 compute-2 ceph-mon[76053]: pgmap v667: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:13:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:10 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4438003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:11 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:11.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:11.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:12 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:12 compute-2 ceph-mon[76053]: pgmap v668: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:13:12 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:13:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:12 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:13 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4438003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:13.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:13.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:14 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:14 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:13:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:14 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:15 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:15 compute-2 ceph-mon[76053]: pgmap v669: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:13:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:15.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:15.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:16 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4438003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:16 compute-2 podman[232442]: 2025-12-01 10:13:16.428674091 +0000 UTC m=+0.086619748 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 01 10:13:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:16 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004050 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:17 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:17 compute-2 ceph-mon[76053]: pgmap v670: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Dec 01 10:13:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:17 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:13:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:17 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:13:17 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:13:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:17.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:17.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:18 compute-2 ceph-mon[76053]: pgmap v671: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 597 B/s wr, 1 op/s
Dec 01 10:13:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:18 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:18 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4438003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:19 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004050 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:13:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:19.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:13:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:19.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.282454) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584000282555, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 481, "num_deletes": 252, "total_data_size": 711696, "memory_usage": 720192, "flush_reason": "Manual Compaction"}
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584000286493, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 351224, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23181, "largest_seqno": 23657, "table_properties": {"data_size": 348788, "index_size": 536, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6335, "raw_average_key_size": 19, "raw_value_size": 343884, "raw_average_value_size": 1058, "num_data_blocks": 24, "num_entries": 325, "num_filter_entries": 325, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764583975, "oldest_key_time": 1764583975, "file_creation_time": 1764584000, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 4142 microseconds, and 1740 cpu microseconds.
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.286578) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 351224 bytes OK
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.286630) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.288629) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.288670) EVENT_LOG_v1 {"time_micros": 1764584000288647, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.288692) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 708797, prev total WAL file size 708797, number of live WAL files 2.
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.289380) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400353031' seq:72057594037927935, type:22 .. '6D67727374617400373534' seq:0, type:0; will stop at (end)
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(342KB)], [42(14MB)]
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584000289496, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 15971246, "oldest_snapshot_seqno": -1}
Dec 01 10:13:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:20 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:20 compute-2 podman[232465]: 2025-12-01 10:13:20.42715961 +0000 UTC m=+0.054747571 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 5304 keys, 11953367 bytes, temperature: kUnknown
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584000428564, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 11953367, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11919500, "index_size": 19485, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13317, "raw_key_size": 134831, "raw_average_key_size": 25, "raw_value_size": 11824840, "raw_average_value_size": 2229, "num_data_blocks": 793, "num_entries": 5304, "num_filter_entries": 5304, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764584000, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.428852) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 11953367 bytes
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.433574) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 114.8 rd, 85.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 14.9 +0.0 blob) out(11.4 +0.0 blob), read-write-amplify(79.5) write-amplify(34.0) OK, records in: 5809, records dropped: 505 output_compression: NoCompression
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.433663) EVENT_LOG_v1 {"time_micros": 1764584000433643, "job": 24, "event": "compaction_finished", "compaction_time_micros": 139171, "compaction_time_cpu_micros": 49448, "output_level": 6, "num_output_files": 1, "total_output_size": 11953367, "num_input_records": 5809, "num_output_records": 5304, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584000434262, "job": 24, "event": "table_file_deletion", "file_number": 44}
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584000437646, "job": 24, "event": "table_file_deletion", "file_number": 42}
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.289293) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.437761) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.437769) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.437778) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.437780) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:13:20 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.437782) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:13:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:20 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 10:13:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:20 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:21 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4438003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:21 compute-2 ceph-mon[76053]: pgmap v672: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 597 B/s wr, 1 op/s
Dec 01 10:13:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:21 compute-2 sshd-session[232436]: Received disconnect from 45.78.219.119 port 37308:11: Bye Bye [preauth]
Dec 01 10:13:21 compute-2 sshd-session[232436]: Disconnected from 45.78.219.119 port 37308 [preauth]
Dec 01 10:13:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:13:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:21.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:13:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:21.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:22 compute-2 ceph-mon[76053]: pgmap v673: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Dec 01 10:13:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:22 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4438003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:22 compute-2 sudo[232489]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:13:22 compute-2 sudo[232489]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:13:22 compute-2 sudo[232489]: pam_unix(sudo:session): session closed for user root
Dec 01 10:13:22 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:13:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:22 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:23 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4468001d70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:13:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:23.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:13:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:23.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:24 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004050 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:24 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4454001020 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:25 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:25 compute-2 ceph-mon[76053]: pgmap v674: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Dec 01 10:13:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:13:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:25.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:13:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:25.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:13:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:26 compute-2 ceph-mon[76053]: pgmap v675: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 10:13:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:26 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4468000df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:26 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004050 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101327 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 3ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:13:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:27 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004050 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:27 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:13:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:27.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:13:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:27.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:13:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:28 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:28 compute-2 ceph-mon[76053]: pgmap v676: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Dec 01 10:13:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:28 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4468000df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:29 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4468000df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:29.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:29.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:30 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4454001b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:30 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:31 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4468000df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:31 compute-2 ceph-mon[76053]: pgmap v677: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Dec 01 10:13:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:31 compute-2 podman[232524]: 2025-12-01 10:13:31.455574648 +0000 UTC m=+0.119552850 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 01 10:13:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:31.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:31.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:32 compute-2 ceph-mon[76053]: pgmap v678: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Dec 01 10:13:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:32 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4468000df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:32 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:13:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:32 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4454002460 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:33 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:33.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:33.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:34 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:34 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:35 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4454002460 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:35 compute-2 ceph-mon[76053]: pgmap v679: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Dec 01 10:13:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:35.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:35.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:36 compute-2 ceph-mon[76053]: pgmap v680: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Dec 01 10:13:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:36 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:36 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4454002460 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:37 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4454002460 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:37 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:13:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:13:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:37.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:13:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:13:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:37.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:13:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:38 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4468009990 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:38 compute-2 sudo[232555]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:13:38 compute-2 sudo[232555]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:13:38 compute-2 sudo[232555]: pam_unix(sudo:session): session closed for user root
Dec 01 10:13:38 compute-2 sudo[232580]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Dec 01 10:13:38 compute-2 sudo[232580]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:13:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:38 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:39 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4454002460 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:39 compute-2 podman[232677]: 2025-12-01 10:13:39.167791262 +0000 UTC m=+0.061493791 container exec 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1)
Dec 01 10:13:39 compute-2 ceph-mon[76053]: pgmap v681: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:13:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:39 compute-2 podman[232677]: 2025-12-01 10:13:39.269921498 +0000 UTC m=+0.163624007 container exec_died 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 10:13:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:39 compute-2 podman[232796]: 2025-12-01 10:13:39.7676387 +0000 UTC m=+0.078066616 container exec f15aa877aa74ea87a56ed7c269b1149a449e9df36b30b3012f6ca84c92ef9519 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 01 10:13:39 compute-2 podman[232796]: 2025-12-01 10:13:39.804037234 +0000 UTC m=+0.114465160 container exec_died f15aa877aa74ea87a56ed7c269b1149a449e9df36b30b3012f6ca84c92ef9519 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 01 10:13:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:39.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:13:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:39.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:13:40 compute-2 podman[232888]: 2025-12-01 10:13:40.128760727 +0000 UTC m=+0.059582887 container exec cfee4a9261ebcc5bb17128b9d9e03ba4db3166b0d880045f27b13fe62969b81b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 10:13:40 compute-2 podman[232888]: 2025-12-01 10:13:40.143194089 +0000 UTC m=+0.074016219 container exec_died cfee4a9261ebcc5bb17128b9d9e03ba4db3166b0d880045f27b13fe62969b81b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0)
Dec 01 10:13:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec 01 10:13:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:13:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:40 compute-2 podman[232953]: 2025-12-01 10:13:40.360110491 +0000 UTC m=+0.051334910 container exec 5c83b48e4050b43a9798275fb3960cbbe72d63867925b25374b9306e495d3a3c (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt)
Dec 01 10:13:40 compute-2 podman[232953]: 2025-12-01 10:13:40.3789696 +0000 UTC m=+0.070193999 container exec_died 5c83b48e4050b43a9798275fb3960cbbe72d63867925b25374b9306e495d3a3c (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt)
Dec 01 10:13:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101340 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:13:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:40 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004210 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:40 compute-2 podman[233021]: 2025-12-01 10:13:40.587610405 +0000 UTC m=+0.055440748 container exec a0ef4b89ca18a14932dc44e4b2a521390b6fd555df2aaa06cada89bf83a23464 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., version=2.2.4, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, com.redhat.component=keepalived-container)
Dec 01 10:13:40 compute-2 podman[233021]: 2025-12-01 10:13:40.620046366 +0000 UTC m=+0.087876699 container exec_died a0ef4b89ca18a14932dc44e4b2a521390b6fd555df2aaa06cada89bf83a23464 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=Ceph keepalived, version=2.2.4, io.buildah.version=1.28.2, name=keepalived, release=1793)
Dec 01 10:13:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:40 compute-2 sudo[232580]: pam_unix(sudo:session): session closed for user root
Dec 01 10:13:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:40 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4468009990 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:40 compute-2 sudo[233089]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:13:40 compute-2 sudo[233089]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:13:40 compute-2 sudo[233089]: pam_unix(sudo:session): session closed for user root
Dec 01 10:13:41 compute-2 sudo[233114]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:13:41 compute-2 sudo[233114]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:13:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:41 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4468009990 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:41 compute-2 ceph-mon[76053]: pgmap v682: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:13:41 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:13:41 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:13:41 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:13:41 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:13:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:41 compute-2 sudo[233114]: pam_unix(sudo:session): session closed for user root
Dec 01 10:13:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:41.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:41.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:42 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Dec 01 10:13:42 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Dec 01 10:13:42 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:13:42 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:13:42 compute-2 ceph-mon[76053]: pgmap v683: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 268 B/s rd, 0 op/s
Dec 01 10:13:42 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:13:42 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:13:42 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:13:42 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:13:42 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:13:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:42 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4454002460 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:42 compute-2 sudo[233173]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:13:42 compute-2 sudo[233173]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:13:42 compute-2 sudo[233173]: pam_unix(sudo:session): session closed for user root
Dec 01 10:13:42 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:13:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:42 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:43 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4468009990 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:43.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:43.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:44 compute-2 nova_compute[230216]: 2025-12-01 10:13:44.202 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:13:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:44 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:44 compute-2 nova_compute[230216]: 2025-12-01 10:13:44.577 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:13:44 compute-2 ceph-mon[76053]: pgmap v684: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 268 B/s rd, 0 op/s
Dec 01 10:13:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:44 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4454003f90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:45 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004250 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:45 compute-2 nova_compute[230216]: 2025-12-01 10:13:45.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:13:45 compute-2 nova_compute[230216]: 2025-12-01 10:13:45.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 10:13:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:45.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:45.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:46 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4468009990 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:46 compute-2 ceph-mon[76053]: pgmap v685: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 268 B/s rd, 0 op/s
Dec 01 10:13:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:46 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4468009990 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:47 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4454003f90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:47 compute-2 nova_compute[230216]: 2025-12-01 10:13:47.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:13:47 compute-2 nova_compute[230216]: 2025-12-01 10:13:47.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:13:47 compute-2 nova_compute[230216]: 2025-12-01 10:13:47.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:13:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:47 compute-2 nova_compute[230216]: 2025-12-01 10:13:47.311 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:13:47 compute-2 nova_compute[230216]: 2025-12-01 10:13:47.312 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:13:47 compute-2 nova_compute[230216]: 2025-12-01 10:13:47.312 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:13:47 compute-2 nova_compute[230216]: 2025-12-01 10:13:47.312 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 10:13:47 compute-2 nova_compute[230216]: 2025-12-01 10:13:47.312 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:13:47 compute-2 podman[233205]: 2025-12-01 10:13:47.405747923 +0000 UTC m=+0.059202867 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 01 10:13:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:47 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:13:47 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3774544798' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:13:47 compute-2 nova_compute[230216]: 2025-12-01 10:13:47.788 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:13:47 compute-2 sudo[233245]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:13:47 compute-2 sudo[233245]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:13:47 compute-2 sudo[233245]: pam_unix(sudo:session): session closed for user root
Dec 01 10:13:47 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:13:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:47.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:47 compute-2 nova_compute[230216]: 2025-12-01 10:13:47.963 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 10:13:47 compute-2 nova_compute[230216]: 2025-12-01 10:13:47.965 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5217MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 10:13:47 compute-2 nova_compute[230216]: 2025-12-01 10:13:47.966 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:13:47 compute-2 nova_compute[230216]: 2025-12-01 10:13:47.966 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:13:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:47.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:48 compute-2 nova_compute[230216]: 2025-12-01 10:13:48.063 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 10:13:48 compute-2 nova_compute[230216]: 2025-12-01 10:13:48.063 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 10:13:48 compute-2 nova_compute[230216]: 2025-12-01 10:13:48.087 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:13:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:48 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004270 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:48 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:13:48 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3125653756' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:13:48 compute-2 nova_compute[230216]: 2025-12-01 10:13:48.546 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:13:48 compute-2 nova_compute[230216]: 2025-12-01 10:13:48.552 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 10:13:48 compute-2 nova_compute[230216]: 2025-12-01 10:13:48.585 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 10:13:48 compute-2 nova_compute[230216]: 2025-12-01 10:13:48.587 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 10:13:48 compute-2 nova_compute[230216]: 2025-12-01 10:13:48.587 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.621s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:13:48 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:13:48 compute-2 ceph-mon[76053]: pgmap v686: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 179 B/s rd, 0 op/s
Dec 01 10:13:48 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:13:48 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3774544798' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:13:48 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3593686561' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:13:48 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3125653756' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:13:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:48 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004270 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004270 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:49 compute-2 nova_compute[230216]: 2025-12-01 10:13:49.581 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:13:49 compute-2 nova_compute[230216]: 2025-12-01 10:13:49.582 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:13:49 compute-2 nova_compute[230216]: 2025-12-01 10:13:49.582 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 10:13:49 compute-2 nova_compute[230216]: 2025-12-01 10:13:49.582 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 10:13:49 compute-2 nova_compute[230216]: 2025-12-01 10:13:49.607 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 10:13:49 compute-2 nova_compute[230216]: 2025-12-01 10:13:49.607 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:13:49 compute-2 nova_compute[230216]: 2025-12-01 10:13:49.608 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:13:49 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1748997009' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:13:49 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3701083602' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:13:49 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/703416877' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:13:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:13:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:49.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:49.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:50 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4454003f90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:50 compute-2 ceph-mon[76053]: pgmap v687: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 179 B/s rd, 0 op/s
Dec 01 10:13:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:50 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4468009990 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:51 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:51 compute-2 podman[233298]: 2025-12-01 10:13:51.397714558 +0000 UTC m=+0.058574833 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 01 10:13:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:51.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:51.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:52 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004290 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:52 compute-2 ceph-mon[76053]: pgmap v688: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 627 B/s wr, 2 op/s
Dec 01 10:13:52 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:13:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:52 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:13:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:52 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:13:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:52 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4454003f90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:53 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c0008d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:53.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:53.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:54 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:54 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004290 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:54 compute-2 ceph-mon[76053]: pgmap v689: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Dec 01 10:13:54 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:13:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:55 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4438001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:55 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 10:13:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:13:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:55.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:13:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:13:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:55.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:13:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:56 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c0008d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:56 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:56 compute-2 ceph-mon[76053]: pgmap v690: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 597 B/s wr, 1 op/s
Dec 01 10:13:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:57 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004290 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:57 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:13:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:57.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:57.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:13:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:58 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4438001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:58 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c001510 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:58 compute-2 ceph-mon[76053]: pgmap v691: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 938 B/s wr, 3 op/s
Dec 01 10:13:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:59 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:13:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:13:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:13:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:59.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:13:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:13:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:13:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:59.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:14:00 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004290 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:14:00 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004290 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:00 compute-2 ceph-mon[76053]: pgmap v692: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Dec 01 10:14:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:14:01 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c001510 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:01.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:01.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101402 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:14:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:14:02 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:02 compute-2 podman[233330]: 2025-12-01 10:14:02.454849998 +0000 UTC m=+0.112025032 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 01 10:14:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:02 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:14:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:14:02 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:02 compute-2 sudo[233357]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:14:02 compute-2 sudo[233357]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:14:02 compute-2 sudo[233357]: pam_unix(sudo:session): session closed for user root
Dec 01 10:14:03 compute-2 ceph-mon[76053]: pgmap v693: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 01 10:14:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:14:03 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:14:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:03.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:14:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:03.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:14:04 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c001510 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:14:04.701 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:14:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:14:04.702 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:14:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:14:04.702 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:14:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:14:04 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4438001f90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:05 compute-2 ceph-mon[76053]: pgmap v694: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Dec 01 10:14:05 compute-2 kernel: ganesha.nfsd[232348]: segfault at 50 ip 00007f450cccc32e sp 00007f44cfffe210 error 4 in libntirpc.so.5.8[7f450ccb1000+2c000] likely on CPU 6 (core 0, socket 6)
Dec 01 10:14:05 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec 01 10:14:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:14:05 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4438001f90 fd 48 proxy ignored for local
Dec 01 10:14:05 compute-2 systemd[1]: Started Process Core Dump (PID 233384/UID 0).
Dec 01 10:14:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:14:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:05.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:14:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:05.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:06 compute-2 systemd-coredump[233385]: Process 232159 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 42:
                                                    #0  0x00007f450cccc32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Dec 01 10:14:06 compute-2 systemd[1]: systemd-coredump@16-233384-0.service: Deactivated successfully.
Dec 01 10:14:06 compute-2 systemd[1]: systemd-coredump@16-233384-0.service: Consumed 1.379s CPU time.
Dec 01 10:14:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:06 compute-2 podman[233392]: 2025-12-01 10:14:06.651430149 +0000 UTC m=+0.028396565 container died cfee4a9261ebcc5bb17128b9d9e03ba4db3166b0d880045f27b13fe62969b81b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 10:14:06 compute-2 systemd[1]: var-lib-containers-storage-overlay-896c2ed1ba6929fb5d39c5fb1e86093b0fd45c6727d0b825ed51b1d3eded0228-merged.mount: Deactivated successfully.
Dec 01 10:14:06 compute-2 podman[233392]: 2025-12-01 10:14:06.692282288 +0000 UTC m=+0.069248684 container remove cfee4a9261ebcc5bb17128b9d9e03ba4db3166b0d880045f27b13fe62969b81b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec 01 10:14:06 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec 01 10:14:06 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec 01 10:14:06 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.749s CPU time.
Dec 01 10:14:07 compute-2 ceph-mon[76053]: pgmap v695: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Dec 01 10:14:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/818614316' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:14:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/818614316' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:14:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:14:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:07.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:07.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:09 compute-2 ceph-mon[76053]: pgmap v696: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Dec 01 10:14:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:09.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:09.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:14:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101411 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:14:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:11 compute-2 ceph-mon[76053]: pgmap v697: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Dec 01 10:14:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:14:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:11.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:14:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:11.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:12 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:14:13 compute-2 ceph-mon[76053]: pgmap v698: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 425 B/s rd, 85 B/s wr, 0 op/s
Dec 01 10:14:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 10:14:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:13.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 10:14:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 10:14:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:13.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 10:14:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:15 compute-2 ceph-mon[76053]: pgmap v699: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:14:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=404 latency=0.002000045s ======
Dec 01 10:14:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:15.854 +0000] "GET /healthcheck HTTP/1.1" 404 242 - "python-urllib3/1.26.5" - latency=0.002000045s
Dec 01 10:14:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:15.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:16.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:17 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 17.
Dec 01 10:14:17 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 10:14:17 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.749s CPU time.
Dec 01 10:14:17 compute-2 ceph-mon[76053]: pgmap v700: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:14:17 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec 01 10:14:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:17 compute-2 podman[233489]: 2025-12-01 10:14:17.307241355 +0000 UTC m=+0.046385218 container create 2104b8728354f2b6f7f66ffd8428f1bdb8c0fa7b8b23bc827bda6a4a9d981791 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 10:14:17 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c657bce7026388b2a70886dcce0fa233723c2bc13c5292a1b2103607e5de391e/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 01 10:14:17 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c657bce7026388b2a70886dcce0fa233723c2bc13c5292a1b2103607e5de391e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 10:14:17 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c657bce7026388b2a70886dcce0fa233723c2bc13c5292a1b2103607e5de391e/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 10:14:17 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c657bce7026388b2a70886dcce0fa233723c2bc13c5292a1b2103607e5de391e/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 10:14:17 compute-2 podman[233489]: 2025-12-01 10:14:17.377374687 +0000 UTC m=+0.116518570 container init 2104b8728354f2b6f7f66ffd8428f1bdb8c0fa7b8b23bc827bda6a4a9d981791 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS)
Dec 01 10:14:17 compute-2 podman[233489]: 2025-12-01 10:14:17.285337032 +0000 UTC m=+0.024480925 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 10:14:17 compute-2 podman[233489]: 2025-12-01 10:14:17.383799285 +0000 UTC m=+0.122943148 container start 2104b8728354f2b6f7f66ffd8428f1bdb8c0fa7b8b23bc827bda6a4a9d981791 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 10:14:17 compute-2 bash[233489]: 2104b8728354f2b6f7f66ffd8428f1bdb8c0fa7b8b23bc827bda6a4a9d981791
Dec 01 10:14:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:17 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 01 10:14:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:17 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 01 10:14:17 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 10:14:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:17 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 01 10:14:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:17 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 01 10:14:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:17 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 01 10:14:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:17 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 01 10:14:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:17 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 01 10:14:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:17 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:14:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:17 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:14:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:17.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:18.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:18 compute-2 podman[233548]: 2025-12-01 10:14:18.401677522 +0000 UTC m=+0.057415260 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 10:14:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:19 compute-2 ceph-mon[76053]: pgmap v701: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:14:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:19.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 10:14:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:20.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 10:14:20 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e137 e137: 3 total, 3 up, 3 in
Dec 01 10:14:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:21 compute-2 ceph-mon[76053]: pgmap v702: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:14:21 compute-2 ceph-mon[76053]: osdmap e137: 3 total, 3 up, 3 in
Dec 01 10:14:21 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e138 e138: 3 total, 3 up, 3 in
Dec 01 10:14:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:14:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:21.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:14:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:22.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:22 compute-2 podman[233573]: 2025-12-01 10:14:22.400115722 +0000 UTC m=+0.055319033 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 10:14:22 compute-2 ceph-mon[76053]: pgmap v704: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 614 B/s wr, 1 op/s
Dec 01 10:14:22 compute-2 ceph-mon[76053]: osdmap e138: 3 total, 3 up, 3 in
Dec 01 10:14:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:22 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e139 e139: 3 total, 3 up, 3 in
Dec 01 10:14:22 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:14:23 compute-2 sudo[233595]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:14:23 compute-2 sudo[233595]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:14:23 compute-2 sudo[233595]: pam_unix(sudo:session): session closed for user root
Dec 01 10:14:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:23 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:14:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:23 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:14:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:23 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 01 10:14:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:23 compute-2 ceph-mon[76053]: osdmap e139: 3 total, 3 up, 3 in
Dec 01 10:14:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:23.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:24.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:24 compute-2 ceph-mon[76053]: pgmap v707: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 1023 B/s wr, 2 op/s
Dec 01 10:14:24 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:14:24 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e140 e140: 3 total, 3 up, 3 in
Dec 01 10:14:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:25 compute-2 ceph-mon[76053]: osdmap e140: 3 total, 3 up, 3 in
Dec 01 10:14:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:25.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:26.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101426 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:14:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:26 compute-2 ceph-mon[76053]: pgmap v709: 353 pgs: 353 active+clean; 13 MiB data, 170 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 2.3 MiB/s wr, 36 op/s
Dec 01 10:14:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:27 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:14:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:27.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:27 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:14:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:27 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:14:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:27 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:14:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:28 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 01 10:14:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:28.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:28 compute-2 ceph-mon[76053]: pgmap v710: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 45 KiB/s rd, 6.8 MiB/s wr, 64 op/s
Dec 01 10:14:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:29.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:30.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:30 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e141 e141: 3 total, 3 up, 3 in
Dec 01 10:14:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:30 compute-2 ceph-mon[76053]: pgmap v711: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 34 KiB/s rd, 5.1 MiB/s wr, 48 op/s
Dec 01 10:14:30 compute-2 ceph-mon[76053]: osdmap e141: 3 total, 3 up, 3 in
Dec 01 10:14:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:31.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:32.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:32 compute-2 ceph-mon[76053]: pgmap v713: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 34 KiB/s rd, 5.1 MiB/s wr, 48 op/s
Dec 01 10:14:32 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:14:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:32 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:14:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:32 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:14:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:32 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:14:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:33 compute-2 podman[233633]: 2025-12-01 10:14:33.431526372 +0000 UTC m=+0.091086775 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 01 10:14:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 10:14:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:33.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 10:14:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:34.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:34 compute-2 ceph-mon[76053]: pgmap v714: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 30 KiB/s rd, 4.6 MiB/s wr, 43 op/s
Dec 01 10:14:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.321811) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584075321937, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 1065, "num_deletes": 256, "total_data_size": 2442652, "memory_usage": 2484112, "flush_reason": "Manual Compaction"}
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584075332156, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 1603094, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23662, "largest_seqno": 24722, "table_properties": {"data_size": 1598144, "index_size": 2474, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 10461, "raw_average_key_size": 19, "raw_value_size": 1588131, "raw_average_value_size": 2914, "num_data_blocks": 108, "num_entries": 545, "num_filter_entries": 545, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764584001, "oldest_key_time": 1764584001, "file_creation_time": 1764584075, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 10389 microseconds, and 4792 cpu microseconds.
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.332217) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 1603094 bytes OK
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.332238) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.333418) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.333433) EVENT_LOG_v1 {"time_micros": 1764584075333429, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.333452) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 2437389, prev total WAL file size 2437389, number of live WAL files 2.
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.334225) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00323532' seq:72057594037927935, type:22 .. '6C6F676D00353034' seq:0, type:0; will stop at (end)
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(1565KB)], [45(11MB)]
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584075334258, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 13556461, "oldest_snapshot_seqno": -1}
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 5315 keys, 13364739 bytes, temperature: kUnknown
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584075429978, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 13364739, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13329100, "index_size": 21257, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13317, "raw_key_size": 136251, "raw_average_key_size": 25, "raw_value_size": 13232520, "raw_average_value_size": 2489, "num_data_blocks": 865, "num_entries": 5315, "num_filter_entries": 5315, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764584075, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.430251) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 13364739 bytes
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.436998) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 141.5 rd, 139.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.5, 11.4 +0.0 blob) out(12.7 +0.0 blob), read-write-amplify(16.8) write-amplify(8.3) OK, records in: 5849, records dropped: 534 output_compression: NoCompression
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.437028) EVENT_LOG_v1 {"time_micros": 1764584075437016, "job": 26, "event": "compaction_finished", "compaction_time_micros": 95818, "compaction_time_cpu_micros": 30928, "output_level": 6, "num_output_files": 1, "total_output_size": 13364739, "num_input_records": 5849, "num_output_records": 5315, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584075437439, "job": 26, "event": "table_file_deletion", "file_number": 47}
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584075440008, "job": 26, "event": "table_file_deletion", "file_number": 45}
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.334129) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.440053) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.440059) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.440060) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.440061) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:14:35 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.440062) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:14:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:35.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:36.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:36 compute-2 ceph-mon[76053]: pgmap v715: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 14 KiB/s rd, 2.8 MiB/s wr, 21 op/s
Dec 01 10:14:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:37 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:14:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:14:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:37.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:14:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:38.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:38 compute-2 ceph-mon[76053]: pgmap v716: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 409 B/s wr, 1 op/s
Dec 01 10:14:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 10:14:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec 01 10:14:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec 01 10:14:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec 01 10:14:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec 01 10:14:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec 01 10:14:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec 01 10:14:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:14:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:14:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:14:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec 01 10:14:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:14:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec 01 10:14:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec 01 10:14:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec 01 10:14:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec 01 10:14:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec 01 10:14:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec 01 10:14:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec 01 10:14:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec 01 10:14:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec 01 10:14:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec 01 10:14:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec 01 10:14:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec 01 10:14:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 10:14:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec 01 10:14:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 10:14:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:14:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ab0000df0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:39 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:14:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:39.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:40.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:40 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a9c0016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:40 compute-2 ceph-mon[76053]: pgmap v717: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 409 B/s wr, 1 op/s
Dec 01 10:14:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:40 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a84000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101441 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:14:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:41 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a90000fa0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 10:14:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:41.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 10:14:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:42 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:14:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:42 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:14:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:42.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:42 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c000d00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:42 compute-2 ceph-mon[76053]: pgmap v718: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 2.8 KiB/s rd, 1.2 KiB/s wr, 4 op/s
Dec 01 10:14:42 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:14:42.733 141949 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 01 10:14:42 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:14:42.735 141949 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 01 10:14:42 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:14:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:42 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a9c002000 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:43 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a840016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:43 compute-2 sudo[233681]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:14:43 compute-2 sudo[233681]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:14:43 compute-2 sudo[233681]: pam_unix(sudo:session): session closed for user root
Dec 01 10:14:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:43.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:14:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:44.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:14:44 compute-2 nova_compute[230216]: 2025-12-01 10:14:44.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:14:44 compute-2 nova_compute[230216]: 2025-12-01 10:14:44.209 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:14:44 compute-2 nova_compute[230216]: 2025-12-01 10:14:44.209 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 01 10:14:44 compute-2 nova_compute[230216]: 2025-12-01 10:14:44.233 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 01 10:14:44 compute-2 nova_compute[230216]: 2025-12-01 10:14:44.234 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:14:44 compute-2 nova_compute[230216]: 2025-12-01 10:14:44.234 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 01 10:14:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:44 compute-2 nova_compute[230216]: 2025-12-01 10:14:44.248 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:14:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:44 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a90001ac0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:44 compute-2 ceph-mon[76053]: pgmap v719: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1.1 KiB/s wr, 3 op/s
Dec 01 10:14:44 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:14:44.737 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 01 10:14:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:44 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c001820 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:45 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 10:14:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:45 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a840016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:46.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 10:14:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:46.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 10:14:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:46 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a9c002000 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:46 compute-2 ceph-mon[76053]: pgmap v720: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1.1 KiB/s wr, 3 op/s
Dec 01 10:14:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:46 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a90001ac0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:47 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c001820 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:47 compute-2 nova_compute[230216]: 2025-12-01 10:14:47.261 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:14:47 compute-2 nova_compute[230216]: 2025-12-01 10:14:47.262 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 10:14:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:47 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:14:47 compute-2 sudo[233712]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:14:47 compute-2 sudo[233712]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:14:47 compute-2 sudo[233712]: pam_unix(sudo:session): session closed for user root
Dec 01 10:14:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:48.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:48 compute-2 sudo[233737]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:14:48 compute-2 sudo[233737]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:14:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:14:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:48.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:14:48 compute-2 nova_compute[230216]: 2025-12-01 10:14:48.200 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:14:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101448 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:14:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:48 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a9c002000 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:48 compute-2 sudo[233737]: pam_unix(sudo:session): session closed for user root
Dec 01 10:14:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:48 compute-2 ceph-mon[76053]: pgmap v721: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 3.3 KiB/s rd, 1.3 KiB/s wr, 4 op/s
Dec 01 10:14:48 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:14:48 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:14:48 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:14:48 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:14:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:48 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a840016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:49 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a90001ac0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:49 compute-2 nova_compute[230216]: 2025-12-01 10:14:49.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:14:49 compute-2 nova_compute[230216]: 2025-12-01 10:14:49.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 10:14:49 compute-2 nova_compute[230216]: 2025-12-01 10:14:49.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 10:14:49 compute-2 nova_compute[230216]: 2025-12-01 10:14:49.228 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 10:14:49 compute-2 nova_compute[230216]: 2025-12-01 10:14:49.229 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:14:49 compute-2 nova_compute[230216]: 2025-12-01 10:14:49.229 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:14:49 compute-2 nova_compute[230216]: 2025-12-01 10:14:49.229 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:14:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:49 compute-2 nova_compute[230216]: 2025-12-01 10:14:49.250 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:14:49 compute-2 nova_compute[230216]: 2025-12-01 10:14:49.251 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:14:49 compute-2 nova_compute[230216]: 2025-12-01 10:14:49.251 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:14:49 compute-2 nova_compute[230216]: 2025-12-01 10:14:49.252 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 10:14:49 compute-2 nova_compute[230216]: 2025-12-01 10:14:49.252 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:14:49 compute-2 podman[233796]: 2025-12-01 10:14:49.402969465 +0000 UTC m=+0.050804999 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 01 10:14:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:49 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:14:49 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/299261219' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:14:49 compute-2 nova_compute[230216]: 2025-12-01 10:14:49.746 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:14:49 compute-2 ceph-mon[76053]: pgmap v722: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 2.6 KiB/s rd, 1.1 KiB/s wr, 4 op/s
Dec 01 10:14:49 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:14:49 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:14:49 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:14:49 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1459856026' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:14:49 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/299261219' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:14:49 compute-2 nova_compute[230216]: 2025-12-01 10:14:49.924 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 10:14:49 compute-2 nova_compute[230216]: 2025-12-01 10:14:49.926 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5191MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 10:14:49 compute-2 nova_compute[230216]: 2025-12-01 10:14:49.926 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:14:49 compute-2 nova_compute[230216]: 2025-12-01 10:14:49.927 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:14:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:50.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:50.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:50 compute-2 nova_compute[230216]: 2025-12-01 10:14:50.081 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 10:14:50 compute-2 nova_compute[230216]: 2025-12-01 10:14:50.082 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 10:14:50 compute-2 nova_compute[230216]: 2025-12-01 10:14:50.148 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Refreshing inventories for resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 01 10:14:50 compute-2 nova_compute[230216]: 2025-12-01 10:14:50.165 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Updating ProviderTree inventory for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 01 10:14:50 compute-2 nova_compute[230216]: 2025-12-01 10:14:50.166 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Updating inventory in ProviderTree for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 01 10:14:50 compute-2 nova_compute[230216]: 2025-12-01 10:14:50.182 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Refreshing aggregate associations for resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 01 10:14:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:50 compute-2 nova_compute[230216]: 2025-12-01 10:14:50.249 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Refreshing trait associations for resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE42,HW_CPU_X86_SSSE3,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AESNI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NODE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_FMA3,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_ABM,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SHA,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_F16C,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 01 10:14:50 compute-2 nova_compute[230216]: 2025-12-01 10:14:50.264 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:14:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:50 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c001820 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:50 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:14:50 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2880999167' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:14:50 compute-2 nova_compute[230216]: 2025-12-01 10:14:50.715 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:14:50 compute-2 nova_compute[230216]: 2025-12-01 10:14:50.720 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 10:14:50 compute-2 nova_compute[230216]: 2025-12-01 10:14:50.741 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 10:14:50 compute-2 nova_compute[230216]: 2025-12-01 10:14:50.743 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 10:14:50 compute-2 nova_compute[230216]: 2025-12-01 10:14:50.743 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.817s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:14:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:50 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a9c002000 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:50 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/4241038156' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:14:50 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1908518850' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:14:50 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2880999167' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:14:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:51 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a840016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:51 compute-2 nova_compute[230216]: 2025-12-01 10:14:51.721 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:14:51 compute-2 nova_compute[230216]: 2025-12-01 10:14:51.721 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:14:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 10:14:52 compute-2 ceph-mon[76053]: pgmap v723: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 2.6 KiB/s rd, 1.1 KiB/s wr, 4 op/s
Dec 01 10:14:52 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/905717349' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:14:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:52.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 10:14:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:52.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:52 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a90002f50 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:52 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:14:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:52 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c002cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:53 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a9c002000 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:53 compute-2 podman[233862]: 2025-12-01 10:14:53.407023013 +0000 UTC m=+0.061442374 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 10:14:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 10:14:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:54.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 10:14:54 compute-2 ceph-mon[76053]: pgmap v724: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 1012 B/s rd, 276 B/s wr, 1 op/s
Dec 01 10:14:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:54.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:54 compute-2 sudo[233884]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:14:54 compute-2 sudo[233884]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:14:54 compute-2 sudo[233884]: pam_unix(sudo:session): session closed for user root
Dec 01 10:14:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101454 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:14:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:54 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a9c002000 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:54 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a90002f50 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:55 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c002cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:14:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:14:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:14:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:56.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:56.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:56 compute-2 irqbalance[789]: Cannot change IRQ 26 affinity: Operation not permitted
Dec 01 10:14:56 compute-2 irqbalance[789]: IRQ 26 affinity is now unmanaged
Dec 01 10:14:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:56 compute-2 ceph-mon[76053]: pgmap v725: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 1012 B/s rd, 276 B/s wr, 1 op/s
Dec 01 10:14:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:56 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c002cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:56 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c002cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:57 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a90002f50 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:57 compute-2 ceph-mon[76053]: pgmap v726: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 276 B/s wr, 1 op/s
Dec 01 10:14:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:57 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:14:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:58.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:14:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:14:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:58.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:14:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:58 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c002cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:58 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c002cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:59 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c002cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:14:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:14:59 compute-2 ceph-mon[76053]: pgmap v727: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 184 B/s rd, 0 op/s
Dec 01 10:15:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:00.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:00.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:00 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a90004050 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:00 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a84002f00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:01 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a9c003cd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:01 compute-2 ceph-mon[76053]: pgmap v728: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:15:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:02.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:02.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:02 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:02 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1912336479' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:15:02 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:15:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:02 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a90004050 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:03 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a84002f00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:03 compute-2 sudo[233918]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:15:03 compute-2 sudo[233918]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:15:03 compute-2 sudo[233918]: pam_unix(sudo:session): session closed for user root
Dec 01 10:15:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:03 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:15:03 compute-2 ceph-mon[76053]: pgmap v729: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:15:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:04.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:04.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:04 compute-2 podman[233944]: 2025-12-01 10:15:04.367281777 +0000 UTC m=+0.088712480 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Dec 01 10:15:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:04 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a9c003cd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:15:04.701 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:15:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:15:04.701 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:15:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:15:04.702 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:15:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:04 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:05 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a90004050 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:06.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:06.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:06 compute-2 ceph-mon[76053]: pgmap v730: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 01 10:15:06 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e142 e142: 3 total, 3 up, 3 in
Dec 01 10:15:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:06 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a84003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:06 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:15:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:06 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:15:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:06 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a9c003cd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:07 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e143 e143: 3 total, 3 up, 3 in
Dec 01 10:15:07 compute-2 ceph-mon[76053]: osdmap e142: 3 total, 3 up, 3 in
Dec 01 10:15:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/2480622640' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:15:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/2480622640' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:15:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:15:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:08.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:08.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:08 compute-2 ceph-mon[76053]: pgmap v732: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 2.1 MiB/s rd, 716 B/s wr, 10 op/s
Dec 01 10:15:08 compute-2 ceph-mon[76053]: osdmap e143: 3 total, 3 up, 3 in
Dec 01 10:15:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:08 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a90004050 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:08 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a84003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:09 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a9c003cd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:09 compute-2 ceph-mon[76053]: pgmap v734: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 2.6 MiB/s rd, 895 B/s wr, 12 op/s
Dec 01 10:15:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:10 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 10:15:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:10.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:10.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:15:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:10 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:10 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:11 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:12.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:12.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:12 compute-2 ceph-mon[76053]: pgmap v735: 353 pgs: 353 active+clean; 72 MiB data, 209 MiB used, 60 GiB / 60 GiB avail; 2.6 MiB/s rd, 1.3 MiB/s wr, 36 op/s
Dec 01 10:15:12 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2391718927' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 10:15:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:12 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a84003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:12 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/629626178' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 10:15:12 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:15:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:12 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8aa80013a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:13 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8aa80013a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:14.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:14.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:14 compute-2 ceph-mon[76053]: pgmap v736: 353 pgs: 353 active+clean; 88 MiB data, 219 MiB used, 60 GiB / 60 GiB avail; 2.6 MiB/s rd, 2.7 MiB/s wr, 56 op/s
Dec 01 10:15:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:14 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:14 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:15 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a78000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:15 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 e144: 3 total, 3 up, 3 in
Dec 01 10:15:15 compute-2 ceph-mon[76053]: pgmap v737: 353 pgs: 353 active+clean; 88 MiB data, 219 MiB used, 60 GiB / 60 GiB avail; 26 KiB/s rd, 2.5 MiB/s wr, 40 op/s
Dec 01 10:15:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:16.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:16.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:16 compute-2 ceph-mon[76053]: osdmap e144: 3 total, 3 up, 3 in
Dec 01 10:15:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:16 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a78000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101516 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:15:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:16 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a74000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:17 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8aa80013a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:17 compute-2 ceph-mon[76053]: pgmap v739: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 27 KiB/s rd, 2.3 MiB/s wr, 43 op/s
Dec 01 10:15:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:17 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:15:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:18.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:18.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:18 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:18 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a78001b40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:19 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a740016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:19 compute-2 ceph-mon[76053]: pgmap v740: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 26 KiB/s rd, 2.1 MiB/s wr, 40 op/s
Dec 01 10:15:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:20.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:20.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:20 compute-2 podman[233990]: 2025-12-01 10:15:20.392438307 +0000 UTC m=+0.050976214 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Dec 01 10:15:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:20 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8aa8002920 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:20 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:21 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a78001b40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:21 compute-2 ceph-mon[76053]: pgmap v741: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 1.2 MiB/s rd, 1.1 MiB/s wr, 66 op/s
Dec 01 10:15:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:22.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:22.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:22 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a78001b40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:22 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:15:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:22 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a740016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:23 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a80001300 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:23 compute-2 sudo[234013]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:15:23 compute-2 sudo[234013]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:15:23 compute-2 sudo[234013]: pam_unix(sudo:session): session closed for user root
Dec 01 10:15:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:23 compute-2 ceph-mon[76053]: pgmap v742: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 15 KiB/s wr, 88 op/s
Dec 01 10:15:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:24.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:24.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:24 compute-2 podman[234038]: 2025-12-01 10:15:24.390135159 +0000 UTC m=+0.052802465 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 01 10:15:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:24 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8aa8002920 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:24 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a78001b40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:24 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:15:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:25 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a740016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:25 compute-2 ceph-mon[76053]: pgmap v743: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 15 KiB/s wr, 88 op/s
Dec 01 10:15:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:26.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:26.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:26 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a80001e20 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:26 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8aa8003910 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:27 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a780032f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:27 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:15:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 10:15:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:28.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 10:15:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:28.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:28 compute-2 ceph-mon[76053]: pgmap v744: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 13 KiB/s wr, 78 op/s
Dec 01 10:15:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:28 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Dec 01 10:15:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:28 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a80001e20 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:28 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a74002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:29 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8aa8003a90 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:29 compute-2 ceph-mon[76053]: pgmap v745: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 68 op/s
Dec 01 10:15:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:30.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:15:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:30.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:15:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:30 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a780032f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:30 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a80001e20 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:31 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a74002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:31 compute-2 ceph-mon[76053]: pgmap v746: 353 pgs: 353 active+clean; 120 MiB data, 276 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 1.8 MiB/s wr, 116 op/s
Dec 01 10:15:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:32.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:32.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:32 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8aa80043b0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:32 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:15:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:32 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a78003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:33 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a80001e20 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101533 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:15:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:33 compute-2 ceph-mon[76053]: pgmap v747: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 1.3 MiB/s rd, 2.1 MiB/s wr, 97 op/s
Dec 01 10:15:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:34.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:34.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:34 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a74003820 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:34 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8aa80043b0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:35 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a78003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:35 compute-2 podman[234070]: 2025-12-01 10:15:35.447546585 +0000 UTC m=+0.097746638 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 01 10:15:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:35 compute-2 ceph-mon[76053]: pgmap v748: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 384 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Dec 01 10:15:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:36.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:36.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:36 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a800036a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:36 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a74003820 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:37 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8aa80043b0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:37 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:15:38 compute-2 ceph-mon[76053]: pgmap v749: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 384 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Dec 01 10:15:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:38.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:38.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a78003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a800036a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a74003820 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:40.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:40.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:40 compute-2 ceph-mon[76053]: pgmap v750: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 384 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Dec 01 10:15:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:15:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:40 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8aa80043b0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:40 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a78003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:41 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a800036a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:41 compute-2 ceph-mon[76053]: pgmap v751: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 390 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Dec 01 10:15:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:15:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:42.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:15:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:42.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:42 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a74003820 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:42 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:15:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:42 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8aa80043b0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:43 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a78003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:43 compute-2 sudo[234105]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:15:43 compute-2 sudo[234105]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:15:43 compute-2 sudo[234105]: pam_unix(sudo:session): session closed for user root
Dec 01 10:15:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:43 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:15:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:43 compute-2 ceph-mon[76053]: pgmap v752: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 34 KiB/s rd, 316 KiB/s wr, 20 op/s
Dec 01 10:15:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 10:15:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:44.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 10:15:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 10:15:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:44.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 10:15:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:44 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a800036a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:44 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a74003820 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:45 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8aa80043b0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:45 compute-2 ceph-mon[76053]: pgmap v753: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 6.9 KiB/s rd, 16 KiB/s wr, 1 op/s
Dec 01 10:15:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 10:15:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:46.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 10:15:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:46.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:46 compute-2 nova_compute[230216]: 2025-12-01 10:15:46.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:15:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:46 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a78003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:46 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:15:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:46 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:15:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:46 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:15:46.738 141949 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 01 10:15:46 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:15:46.739 141949 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 01 10:15:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:46 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a800036a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:47 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a84001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:15:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:47 compute-2 ceph-mon[76053]: pgmap v754: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 7.6 KiB/s rd, 17 KiB/s wr, 3 op/s
Dec 01 10:15:47 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:15:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 10:15:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:48.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 10:15:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:48.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:48 compute-2 nova_compute[230216]: 2025-12-01 10:15:48.202 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:15:48 compute-2 nova_compute[230216]: 2025-12-01 10:15:48.219 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:15:48 compute-2 nova_compute[230216]: 2025-12-01 10:15:48.220 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 10:15:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:48 compute-2 kernel: ganesha.nfsd[233976]: segfault at 50 ip 00007f8b596e932e sp 00007f8b24ff8210 error 4 in libntirpc.so.5.8[7f8b596ce000+2c000] likely on CPU 2 (core 0, socket 2)
Dec 01 10:15:48 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec 01 10:15:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:48 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8aa80043b0 fd 42 proxy ignored for local
Dec 01 10:15:48 compute-2 systemd[1]: Started Process Core Dump (PID 234135/UID 0).
Dec 01 10:15:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:49 compute-2 nova_compute[230216]: 2025-12-01 10:15:49.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:15:49 compute-2 nova_compute[230216]: 2025-12-01 10:15:49.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 10:15:49 compute-2 nova_compute[230216]: 2025-12-01 10:15:49.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 10:15:49 compute-2 nova_compute[230216]: 2025-12-01 10:15:49.226 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 10:15:49 compute-2 nova_compute[230216]: 2025-12-01 10:15:49.226 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:15:49 compute-2 nova_compute[230216]: 2025-12-01 10:15:49.227 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:15:49 compute-2 nova_compute[230216]: 2025-12-01 10:15:49.227 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:15:49 compute-2 nova_compute[230216]: 2025-12-01 10:15:49.250 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:15:49 compute-2 nova_compute[230216]: 2025-12-01 10:15:49.250 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:15:49 compute-2 nova_compute[230216]: 2025-12-01 10:15:49.250 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:15:49 compute-2 nova_compute[230216]: 2025-12-01 10:15:49.250 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 10:15:49 compute-2 nova_compute[230216]: 2025-12-01 10:15:49.251 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:15:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:49 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:15:49 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1790043656' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:15:49 compute-2 nova_compute[230216]: 2025-12-01 10:15:49.721 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:15:49 compute-2 systemd-coredump[234136]: Process 233510 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 54:
                                                    #0  0x00007f8b596e932e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Dec 01 10:15:49 compute-2 nova_compute[230216]: 2025-12-01 10:15:49.894 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 10:15:49 compute-2 nova_compute[230216]: 2025-12-01 10:15:49.896 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5165MB free_disk=59.942710876464844GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 10:15:49 compute-2 nova_compute[230216]: 2025-12-01 10:15:49.897 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:15:49 compute-2 nova_compute[230216]: 2025-12-01 10:15:49.897 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:15:49 compute-2 ceph-mon[76053]: pgmap v755: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 7.2 KiB/s rd, 4.2 KiB/s wr, 2 op/s
Dec 01 10:15:49 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1790043656' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:15:49 compute-2 nova_compute[230216]: 2025-12-01 10:15:49.955 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 10:15:49 compute-2 nova_compute[230216]: 2025-12-01 10:15:49.955 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 10:15:49 compute-2 nova_compute[230216]: 2025-12-01 10:15:49.969 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:15:49 compute-2 systemd[1]: systemd-coredump@17-234135-0.service: Deactivated successfully.
Dec 01 10:15:49 compute-2 systemd[1]: systemd-coredump@17-234135-0.service: Consumed 1.341s CPU time.
Dec 01 10:15:50 compute-2 podman[234166]: 2025-12-01 10:15:50.027238396 +0000 UTC m=+0.023925320 container died 2104b8728354f2b6f7f66ffd8428f1bdb8c0fa7b8b23bc827bda6a4a9d981791 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, ceph=True)
Dec 01 10:15:50 compute-2 systemd[1]: var-lib-containers-storage-overlay-c657bce7026388b2a70886dcce0fa233723c2bc13c5292a1b2103607e5de391e-merged.mount: Deactivated successfully.
Dec 01 10:15:50 compute-2 podman[234166]: 2025-12-01 10:15:50.088044905 +0000 UTC m=+0.084731829 container remove 2104b8728354f2b6f7f66ffd8428f1bdb8c0fa7b8b23bc827bda6a4a9d981791 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, io.buildah.version=1.40.1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250325)
Dec 01 10:15:50 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec 01 10:15:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:50.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 10:15:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:50.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 10:15:50 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec 01 10:15:50 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.688s CPU time.
Dec 01 10:15:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:50 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:15:50 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/675847557' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:15:50 compute-2 nova_compute[230216]: 2025-12-01 10:15:50.453 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:15:50 compute-2 nova_compute[230216]: 2025-12-01 10:15:50.460 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 10:15:50 compute-2 nova_compute[230216]: 2025-12-01 10:15:50.482 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 10:15:50 compute-2 nova_compute[230216]: 2025-12-01 10:15:50.485 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 10:15:50 compute-2 nova_compute[230216]: 2025-12-01 10:15:50.486 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.588s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:15:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:50 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/675847557' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:15:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:51 compute-2 podman[234232]: 2025-12-01 10:15:51.404706059 +0000 UTC m=+0.057172745 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 01 10:15:51 compute-2 nova_compute[230216]: 2025-12-01 10:15:51.466 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:15:51 compute-2 nova_compute[230216]: 2025-12-01 10:15:51.466 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:15:51 compute-2 nova_compute[230216]: 2025-12-01 10:15:51.467 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:15:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:51 compute-2 ceph-mon[76053]: pgmap v756: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 7.2 KiB/s rd, 5.2 KiB/s wr, 2 op/s
Dec 01 10:15:51 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1649284944' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:15:51 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3930119811' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:15:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:15:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:52.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:15:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:52.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:52 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:15:52 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/410573799' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:15:52 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1405939835' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:15:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:54 compute-2 ceph-mon[76053]: pgmap v757: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 2.0 KiB/s wr, 3 op/s
Dec 01 10:15:54 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/297688288' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:15:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 10:15:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:54.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 10:15:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:54.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:54 compute-2 sudo[234255]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:15:54 compute-2 sudo[234255]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:15:54 compute-2 sudo[234255]: pam_unix(sudo:session): session closed for user root
Dec 01 10:15:54 compute-2 podman[234279]: 2025-12-01 10:15:54.523513539 +0000 UTC m=+0.049553150 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 01 10:15:54 compute-2 sudo[234286]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:15:54 compute-2 sudo[234286]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:15:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:55 compute-2 sudo[234286]: pam_unix(sudo:session): session closed for user root
Dec 01 10:15:55 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2861935731' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:15:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:15:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101555 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:15:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101555 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:15:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:55 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:15:55.740 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 01 10:15:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:56.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:56.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:56 compute-2 ceph-mon[76053]: pgmap v758: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 1.9 KiB/s wr, 3 op/s
Dec 01 10:15:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:57 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:15:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94125d0 =====
Dec 01 10:15:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:15:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94125d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:15:58 compute-2 radosgw[82855]: beast: 0x7f23a94125d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:58.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:15:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 10:15:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:58.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 10:15:58 compute-2 ceph-mon[76053]: pgmap v759: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 2.0 KiB/s wr, 10 op/s
Dec 01 10:15:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:15:59 compute-2 sshd-session[234254]: Invalid user postgres from 45.78.219.119 port 53662
Dec 01 10:15:59 compute-2 ceph-mon[76053]: pgmap v760: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.5 KiB/s wr, 8 op/s
Dec 01 10:15:59 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:15:59 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:15:59 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3782279426' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 10:15:59 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:15:59 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:15:59 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:16:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:00.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:00.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:00 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 18.
Dec 01 10:16:00 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 10:16:00 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.688s CPU time.
Dec 01 10:16:00 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec 01 10:16:00 compute-2 podman[234408]: 2025-12-01 10:16:00.607711362 +0000 UTC m=+0.104599615 container create 44c38f9baac06bdbc14a37a894ce36f344af11ad4c6d090fe324fb283b8dc8d0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=squid)
Dec 01 10:16:00 compute-2 podman[234408]: 2025-12-01 10:16:00.523976178 +0000 UTC m=+0.020864461 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 10:16:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:00 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c981a67c8a447831287f3864d644dcf46e5e63180738ed797aed8af83e0eb49/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 01 10:16:00 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c981a67c8a447831287f3864d644dcf46e5e63180738ed797aed8af83e0eb49/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 10:16:00 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c981a67c8a447831287f3864d644dcf46e5e63180738ed797aed8af83e0eb49/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 10:16:00 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c981a67c8a447831287f3864d644dcf46e5e63180738ed797aed8af83e0eb49/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 01 10:16:00 compute-2 podman[234408]: 2025-12-01 10:16:00.691696943 +0000 UTC m=+0.188585226 container init 44c38f9baac06bdbc14a37a894ce36f344af11ad4c6d090fe324fb283b8dc8d0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, ceph=True, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 10:16:00 compute-2 podman[234408]: 2025-12-01 10:16:00.698013988 +0000 UTC m=+0.194902241 container start 44c38f9baac06bdbc14a37a894ce36f344af11ad4c6d090fe324fb283b8dc8d0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0)
Dec 01 10:16:00 compute-2 bash[234408]: 44c38f9baac06bdbc14a37a894ce36f344af11ad4c6d090fe324fb283b8dc8d0
Dec 01 10:16:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:00 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 01 10:16:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:00 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 01 10:16:00 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 10:16:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:00 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 01 10:16:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:00 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 01 10:16:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:00 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 01 10:16:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:00 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 01 10:16:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:00 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 01 10:16:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:00 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 01 10:16:01 compute-2 ceph-mon[76053]: pgmap v761: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.6 KiB/s wr, 9 op/s
Dec 01 10:16:01 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:16:01 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:16:01 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:16:01 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:16:01 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/888313626' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 10:16:01 compute-2 ceph-mon[76053]: Health check cleared: CEPHADM_FAILED_DAEMON (was: 1 failed cephadm daemon(s))
Dec 01 10:16:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 10:16:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:02.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 10:16:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec 01 10:16:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:02.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec 01 10:16:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:02 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:16:03 compute-2 ceph-mon[76053]: pgmap v762: 353 pgs: 353 active+clean; 167 MiB data, 301 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.9 MiB/s wr, 39 op/s
Dec 01 10:16:03 compute-2 sshd-session[234254]: Received disconnect from 45.78.219.119 port 53662:11: Bye Bye [preauth]
Dec 01 10:16:03 compute-2 sshd-session[234254]: Disconnected from invalid user postgres 45.78.219.119 port 53662 [preauth]
Dec 01 10:16:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:03 compute-2 sudo[234470]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:16:03 compute-2 sudo[234470]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:16:03 compute-2 sudo[234470]: pam_unix(sudo:session): session closed for user root
Dec 01 10:16:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:04.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:04.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:16:04.702 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:16:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:16:04.703 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:16:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:16:04.703 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:16:05 compute-2 ceph-mon[76053]: pgmap v763: 353 pgs: 353 active+clean; 167 MiB data, 301 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.9 MiB/s wr, 37 op/s
Dec 01 10:16:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:06.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:06.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:06 compute-2 sudo[234497]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:16:06 compute-2 sudo[234497]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:16:06 compute-2 sudo[234497]: pam_unix(sudo:session): session closed for user root
Dec 01 10:16:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:06 compute-2 podman[234521]: 2025-12-01 10:16:06.312617925 +0000 UTC m=+0.089212671 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 10:16:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:06 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 01 10:16:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:06 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 01 10:16:07 compute-2 ceph-mon[76053]: pgmap v764: 353 pgs: 353 active+clean; 167 MiB data, 301 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.9 MiB/s wr, 48 op/s
Dec 01 10:16:07 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:16:07 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:16:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:16:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/3226630919' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:16:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/3226630919' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:16:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:08.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:08.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:09 compute-2 ceph-mon[76053]: pgmap v765: 353 pgs: 353 active+clean; 167 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 28 KiB/s rd, 2.0 MiB/s wr, 43 op/s
Dec 01 10:16:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:10.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:10.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:11 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:16:11 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:16:11 compute-2 ceph-mon[76053]: pgmap v766: 353 pgs: 353 active+clean; 167 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 28 KiB/s rd, 2.0 MiB/s wr, 43 op/s
Dec 01 10:16:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:12.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:12.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:12 compute-2 ceph-mon[76053]: pgmap v767: 353 pgs: 353 active+clean; 167 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 105 op/s
Dec 01 10:16:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 01 10:16:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec 01 10:16:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec 01 10:16:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec 01 10:16:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec 01 10:16:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec 01 10:16:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec 01 10:16:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:16:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:16:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:16:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec 01 10:16:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 01 10:16:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec 01 10:16:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec 01 10:16:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec 01 10:16:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec 01 10:16:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec 01 10:16:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec 01 10:16:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec 01 10:16:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec 01 10:16:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec 01 10:16:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec 01 10:16:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec 01 10:16:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec 01 10:16:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 10:16:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec 01 10:16:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 01 10:16:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:13 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:16:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:14.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:16:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:14.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:16:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:14 compute-2 ceph-mon[76053]: pgmap v768: 353 pgs: 353 active+clean; 167 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 15 KiB/s wr, 77 op/s
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.375783) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584174375950, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 1353, "num_deletes": 251, "total_data_size": 3227320, "memory_usage": 3275944, "flush_reason": "Manual Compaction"}
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584174388243, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 2079191, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 24727, "largest_seqno": 26075, "table_properties": {"data_size": 2073500, "index_size": 3022, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12889, "raw_average_key_size": 20, "raw_value_size": 2061638, "raw_average_value_size": 3221, "num_data_blocks": 135, "num_entries": 640, "num_filter_entries": 640, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764584076, "oldest_key_time": 1764584076, "file_creation_time": 1764584174, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 12516 microseconds, and 5581 cpu microseconds.
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.388326) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 2079191 bytes OK
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.388358) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.390232) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.390252) EVENT_LOG_v1 {"time_micros": 1764584174390247, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.390275) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 3220898, prev total WAL file size 3220898, number of live WAL files 2.
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.391444) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031373537' seq:72057594037927935, type:22 .. '7061786F730032303039' seq:0, type:0; will stop at (end)
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(2030KB)], [48(12MB)]
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584174391560, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 15443930, "oldest_snapshot_seqno": -1}
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 5434 keys, 13260844 bytes, temperature: kUnknown
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584174473100, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 13260844, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13224569, "index_size": 21573, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13637, "raw_key_size": 139468, "raw_average_key_size": 25, "raw_value_size": 13126001, "raw_average_value_size": 2415, "num_data_blocks": 876, "num_entries": 5434, "num_filter_entries": 5434, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764584174, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.473359) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 13260844 bytes
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.474944) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 189.2 rd, 162.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 12.7 +0.0 blob) out(12.6 +0.0 blob), read-write-amplify(13.8) write-amplify(6.4) OK, records in: 5955, records dropped: 521 output_compression: NoCompression
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.474965) EVENT_LOG_v1 {"time_micros": 1764584174474955, "job": 28, "event": "compaction_finished", "compaction_time_micros": 81616, "compaction_time_cpu_micros": 27215, "output_level": 6, "num_output_files": 1, "total_output_size": 13260844, "num_input_records": 5955, "num_output_records": 5434, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584174475353, "job": 28, "event": "table_file_deletion", "file_number": 50}
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584174477423, "job": 28, "event": "table_file_deletion", "file_number": 48}
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.391308) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.477526) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.477533) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.477535) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.477536) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:16:14 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.477538) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:16:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:14 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b4000da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:15 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101615 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 01 10:16:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:15 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a8000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:16.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:16.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:16 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b8001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:16 compute-2 ceph-mon[76053]: pgmap v769: 353 pgs: 353 active+clean; 167 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 16 KiB/s wr, 78 op/s
Dec 01 10:16:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:17 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b8001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:17 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc0021f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:18.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:18.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:18 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:16:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:18 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:18 compute-2 ceph-mon[76053]: pgmap v770: 353 pgs: 353 active+clean; 167 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 67 op/s
Dec 01 10:16:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:19 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b4001ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:19 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b80021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:20.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:20.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:20 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b80021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:21 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:21 compute-2 ceph-mon[76053]: pgmap v771: 353 pgs: 353 active+clean; 167 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.4 KiB/s wr, 65 op/s
Dec 01 10:16:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:21 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b4001ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:22.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:22.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:22 compute-2 podman[234580]: 2025-12-01 10:16:22.389354418 +0000 UTC m=+0.046251475 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 01 10:16:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:22 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b80021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:23 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:23 compute-2 ceph-mon[76053]: pgmap v772: 353 pgs: 353 active+clean; 198 MiB data, 344 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 120 op/s
Dec 01 10:16:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:23 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:23 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:16:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:23 compute-2 sudo[234601]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:16:23 compute-2 sudo[234601]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:16:23 compute-2 sudo[234601]: pam_unix(sudo:session): session closed for user root
Dec 01 10:16:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:24.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:24.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:24 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b4001ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:25 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b80021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:25 compute-2 ceph-mon[76053]: pgmap v773: 353 pgs: 353 active+clean; 198 MiB data, 344 MiB used, 60 GiB / 60 GiB avail; 292 KiB/s rd, 2.1 MiB/s wr, 54 op/s
Dec 01 10:16:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:16:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:25 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:25 compute-2 podman[234627]: 2025-12-01 10:16:25.418878536 +0000 UTC m=+0.063570319 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd)
Dec 01 10:16:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:26.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:16:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:26.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:16:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:26 compute-2 ceph-mon[76053]: pgmap v774: 353 pgs: 353 active+clean; 200 MiB data, 344 MiB used, 60 GiB / 60 GiB avail; 302 KiB/s rd, 2.1 MiB/s wr, 62 op/s
Dec 01 10:16:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:26 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:27 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b4001ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:27 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b80021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:28.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:28.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:28 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:16:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:28 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:28 compute-2 ceph-mon[76053]: pgmap v775: 353 pgs: 353 active+clean; 200 MiB data, 344 MiB used, 60 GiB / 60 GiB avail; 303 KiB/s rd, 2.1 MiB/s wr, 62 op/s
Dec 01 10:16:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:29 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:29 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b4001ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:16:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:30.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:16:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:30.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:30 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:30 compute-2 ceph-mon[76053]: pgmap v776: 353 pgs: 353 active+clean; 200 MiB data, 344 MiB used, 60 GiB / 60 GiB avail; 303 KiB/s rd, 2.1 MiB/s wr, 62 op/s
Dec 01 10:16:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:31 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:31 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:31 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/391728244' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:16:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:16:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:32.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:16:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:32.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:32 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b4001ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:33 compute-2 ceph-mon[76053]: pgmap v777: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 322 KiB/s rd, 2.1 MiB/s wr, 91 op/s
Dec 01 10:16:33 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/4104340304' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:16:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:33 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:33 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:33 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:16:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:16:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:34.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:16:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:16:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:34.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:16:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:34 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:35 compute-2 ceph-mon[76053]: pgmap v778: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 30 KiB/s rd, 59 KiB/s wr, 36 op/s
Dec 01 10:16:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:35 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b4001ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:35 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc00a340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:36.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:36.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:36 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc00a340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:37 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:37 compute-2 ceph-mon[76053]: pgmap v779: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 30 KiB/s rd, 68 KiB/s wr, 37 op/s
Dec 01 10:16:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:37 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:37 compute-2 podman[234659]: 2025-12-01 10:16:37.429773387 +0000 UTC m=+0.088508250 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 01 10:16:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:38.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:38.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:38 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:16:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:38 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a0000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:39 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:39 compute-2 ceph-mon[76053]: pgmap v780: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 22 KiB/s wr, 30 op/s
Dec 01 10:16:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:39 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:40.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.002000048s ======
Dec 01 10:16:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:40.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000048s
Dec 01 10:16:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:16:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:40 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a0001680 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:41 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc00a340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:41 compute-2 ceph-mon[76053]: pgmap v781: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 22 KiB/s wr, 29 op/s
Dec 01 10:16:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:41 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:42.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:16:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:42.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:16:42 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3976190664' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:16:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:42 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a0001680 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:43 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:43 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:43 compute-2 ceph-mon[76053]: pgmap v782: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 38 KiB/s rd, 24 KiB/s wr, 57 op/s
Dec 01 10:16:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:43 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:16:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:43 compute-2 sudo[234692]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:16:43 compute-2 sudo[234692]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:16:43 compute-2 sudo[234692]: pam_unix(sudo:session): session closed for user root
Dec 01 10:16:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:16:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:44.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:16:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:44.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:44 compute-2 ceph-mon[76053]: pgmap v783: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 10 KiB/s wr, 29 op/s
Dec 01 10:16:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:44 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:45 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a0001680 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:45 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe79c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:46.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:16:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:46.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:16:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:46 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc00a340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:46 compute-2 ceph-mon[76053]: pgmap v784: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 10 KiB/s wr, 29 op/s
Dec 01 10:16:47 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:16:47.004 141949 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 01 10:16:47 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:16:47.006 141949 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 01 10:16:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:47 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:47 compute-2 nova_compute[230216]: 2025-12-01 10:16:47.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:16:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:47 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a0001680 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:48.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:48.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:48 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:16:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:48 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe79c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:49 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc00a340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:49 compute-2 ceph-mon[76053]: pgmap v785: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Dec 01 10:16:49 compute-2 nova_compute[230216]: 2025-12-01 10:16:49.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:16:49 compute-2 nova_compute[230216]: 2025-12-01 10:16:49.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:16:49 compute-2 nova_compute[230216]: 2025-12-01 10:16:49.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 10:16:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:49 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:50.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:16:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:50.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:16:50 compute-2 nova_compute[230216]: 2025-12-01 10:16:50.201 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:16:50 compute-2 nova_compute[230216]: 2025-12-01 10:16:50.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:16:50 compute-2 nova_compute[230216]: 2025-12-01 10:16:50.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 10:16:50 compute-2 nova_compute[230216]: 2025-12-01 10:16:50.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 10:16:50 compute-2 nova_compute[230216]: 2025-12-01 10:16:50.224 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 10:16:50 compute-2 nova_compute[230216]: 2025-12-01 10:16:50.224 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:16:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:50 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a0002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:51 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:16:51.009 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 01 10:16:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:51 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe79c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:51 compute-2 ceph-mon[76053]: pgmap v786: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Dec 01 10:16:51 compute-2 nova_compute[230216]: 2025-12-01 10:16:51.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:16:51 compute-2 nova_compute[230216]: 2025-12-01 10:16:51.231 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:16:51 compute-2 nova_compute[230216]: 2025-12-01 10:16:51.231 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:16:51 compute-2 nova_compute[230216]: 2025-12-01 10:16:51.232 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:16:51 compute-2 nova_compute[230216]: 2025-12-01 10:16:51.232 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 10:16:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:51 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc00a340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:51 compute-2 nova_compute[230216]: 2025-12-01 10:16:51.232 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:16:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:51 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:16:51 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2043758752' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:16:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:51 compute-2 nova_compute[230216]: 2025-12-01 10:16:51.708 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:16:51 compute-2 nova_compute[230216]: 2025-12-01 10:16:51.870 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 10:16:51 compute-2 nova_compute[230216]: 2025-12-01 10:16:51.871 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5177MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 10:16:51 compute-2 nova_compute[230216]: 2025-12-01 10:16:51.872 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:16:51 compute-2 nova_compute[230216]: 2025-12-01 10:16:51.872 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:16:51 compute-2 nova_compute[230216]: 2025-12-01 10:16:51.935 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 10:16:51 compute-2 nova_compute[230216]: 2025-12-01 10:16:51.936 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 10:16:51 compute-2 nova_compute[230216]: 2025-12-01 10:16:51.958 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:16:52 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2043758752' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:16:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:52.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:16:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:52.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:16:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:52 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:16:52 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1056125986' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:16:52 compute-2 nova_compute[230216]: 2025-12-01 10:16:52.461 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:16:52 compute-2 nova_compute[230216]: 2025-12-01 10:16:52.467 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 10:16:52 compute-2 nova_compute[230216]: 2025-12-01 10:16:52.490 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 10:16:52 compute-2 nova_compute[230216]: 2025-12-01 10:16:52.491 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 10:16:52 compute-2 nova_compute[230216]: 2025-12-01 10:16:52.491 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:16:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:52 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:53 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a0002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:53 compute-2 ceph-mon[76053]: pgmap v787: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Dec 01 10:16:53 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1056125986' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:16:53 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3552851747' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:16:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:53 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a0002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:53 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:16:53 compute-2 podman[234771]: 2025-12-01 10:16:53.394523301 +0000 UTC m=+0.048311524 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Dec 01 10:16:53 compute-2 nova_compute[230216]: 2025-12-01 10:16:53.492 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:16:53 compute-2 nova_compute[230216]: 2025-12-01 10:16:53.492 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:16:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:54.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:54.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:54 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3807992003' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:16:54 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2764423619' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:16:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:54 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc00a340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:55 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:55 compute-2 ceph-mon[76053]: pgmap v788: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:16:55 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2087643151' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:16:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:16:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:55 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a0002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:56.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:56.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:56 compute-2 podman[234794]: 2025-12-01 10:16:56.402463221 +0000 UTC m=+0.056410384 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 01 10:16:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:56 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe79c002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:57 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc00a340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:57 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:57 compute-2 ceph-mon[76053]: pgmap v789: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:16:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:58.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:16:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:16:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:58.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:16:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:58 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:16:58 compute-2 ceph-mon[76053]: pgmap v790: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 01 10:16:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:58 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a0002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:59 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe79c002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:59 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc00a340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:16:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:16:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:00.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:00.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:17:00 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:17:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:00 compute-2 ceph-mon[76053]: pgmap v791: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:17:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:17:01 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe79c002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 01 10:17:01 compute-2 kernel: ganesha.nfsd[234717]: segfault at 50 ip 00007fe8750e632e sp 00007fe825ffa210 error 4 in libntirpc.so.5.8[7fe8750cb000+2c000] likely on CPU 2 (core 0, socket 2)
Dec 01 10:17:01 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec 01 10:17:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:17:01 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe79c002720 fd 38 proxy ignored for local
Dec 01 10:17:01 compute-2 systemd[1]: Started Process Core Dump (PID 234819/UID 0).
Dec 01 10:17:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:01 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/545670811' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:17:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:02.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:02.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:02 compute-2 ceph-mon[76053]: pgmap v792: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 01 10:17:03 compute-2 systemd-coredump[234820]: Process 234428 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 56:
                                                    #0  0x00007fe8750e632e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Dec 01 10:17:03 compute-2 systemd[1]: systemd-coredump@18-234819-0.service: Deactivated successfully.
Dec 01 10:17:03 compute-2 systemd[1]: systemd-coredump@18-234819-0.service: Consumed 1.804s CPU time.
Dec 01 10:17:03 compute-2 podman[234826]: 2025-12-01 10:17:03.214419092 +0000 UTC m=+0.034947787 container died 44c38f9baac06bdbc14a37a894ce36f344af11ad4c6d090fe324fb283b8dc8d0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.schema-version=1.0)
Dec 01 10:17:03 compute-2 systemd[1]: var-lib-containers-storage-overlay-6c981a67c8a447831287f3864d644dcf46e5e63180738ed797aed8af83e0eb49-merged.mount: Deactivated successfully.
Dec 01 10:17:03 compute-2 podman[234826]: 2025-12-01 10:17:03.261111896 +0000 UTC m=+0.081640571 container remove 44c38f9baac06bdbc14a37a894ce36f344af11ad4c6d090fe324fb283b8dc8d0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec 01 10:17:03 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec 01 10:17:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:03 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:17:03 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec 01 10:17:03 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 2.083s CPU time.
Dec 01 10:17:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:03 compute-2 sudo[234872]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:17:03 compute-2 sudo[234872]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:17:03 compute-2 sudo[234872]: pam_unix(sudo:session): session closed for user root
Dec 01 10:17:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:04.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:04.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:17:04.703 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:17:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:17:04.703 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:17:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:17:04.703 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:17:04 compute-2 ceph-mon[76053]: pgmap v793: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 01 10:17:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:06.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:06.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:06 compute-2 sudo[234899]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:17:06 compute-2 sudo[234899]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:17:06 compute-2 sudo[234899]: pam_unix(sudo:session): session closed for user root
Dec 01 10:17:06 compute-2 sudo[234924]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:17:06 compute-2 sudo[234924]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:17:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:06 compute-2 sudo[234924]: pam_unix(sudo:session): session closed for user root
Dec 01 10:17:06 compute-2 ceph-mon[76053]: pgmap v794: 353 pgs: 353 active+clean; 75 MiB data, 266 MiB used, 60 GiB / 60 GiB avail; 7.8 KiB/s rd, 1.2 MiB/s wr, 15 op/s
Dec 01 10:17:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 01 10:17:07 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1415784798' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:17:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 01 10:17:07 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1415784798' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:17:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101707 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:17:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/1415784798' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:17:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/1415784798' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:17:07 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:17:07 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:17:07 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:17:07 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:17:07 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:17:07 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:17:07 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:17:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:17:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:08.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:17:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:08.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:08 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:17:08 compute-2 podman[234982]: 2025-12-01 10:17:08.435707668 +0000 UTC m=+0.090839258 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 01 10:17:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:09 compute-2 ceph-mon[76053]: pgmap v795: 353 pgs: 353 active+clean; 88 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 01 10:17:09 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2144203208' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 10:17:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:17:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:10.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:17:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:17:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:10.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:17:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:10 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1701952053' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 10:17:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:17:10 compute-2 ceph-mon[76053]: pgmap v796: 353 pgs: 353 active+clean; 88 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 01 10:17:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:12.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:17:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:12.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:17:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:12 compute-2 ceph-mon[76053]: pgmap v797: 353 pgs: 353 active+clean; 88 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.8 MiB/s wr, 32 op/s
Dec 01 10:17:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:13 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:17:13 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 19.
Dec 01 10:17:13 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 10:17:13 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 2.083s CPU time.
Dec 01 10:17:13 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Start request repeated too quickly.
Dec 01 10:17:13 compute-2 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec 01 10:17:13 compute-2 systemd[1]: Failed to start Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec 01 10:17:13 compute-2 sudo[235016]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:17:13 compute-2 sudo[235016]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:17:13 compute-2 sudo[235016]: pam_unix(sudo:session): session closed for user root
Dec 01 10:17:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:14.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:14.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:14 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:17:14 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:17:14 compute-2 ceph-mon[76053]: pgmap v798: 353 pgs: 353 active+clean; 88 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.8 MiB/s wr, 32 op/s
Dec 01 10:17:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:16.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:16.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:16 compute-2 ceph-mon[76053]: pgmap v799: 353 pgs: 353 active+clean; 88 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.3 MiB/s rd, 1.8 MiB/s wr, 76 op/s
Dec 01 10:17:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:18.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:18.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:18 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:17:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:18 compute-2 ceph-mon[76053]: pgmap v800: 353 pgs: 353 active+clean; 88 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 588 KiB/s wr, 85 op/s
Dec 01 10:17:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:20.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:20.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:20 compute-2 ceph-mon[76053]: pgmap v801: 353 pgs: 353 active+clean; 88 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 73 op/s
Dec 01 10:17:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:22.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:17:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:22.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:17:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:22 compute-2 ceph-mon[76053]: pgmap v802: 353 pgs: 353 active+clean; 88 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Dec 01 10:17:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:23 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:17:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:24 compute-2 sudo[235051]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:17:24 compute-2 sudo[235051]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:17:24 compute-2 sudo[235051]: pam_unix(sudo:session): session closed for user root
Dec 01 10:17:24 compute-2 podman[235075]: 2025-12-01 10:17:24.15073743 +0000 UTC m=+0.057973952 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 01 10:17:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:24.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:24.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:25 compute-2 ceph-mon[76053]: pgmap v803: 353 pgs: 353 active+clean; 88 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 0 B/s wr, 69 op/s
Dec 01 10:17:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:17:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:26.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:26.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101726 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:17:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:27 compute-2 ceph-mon[76053]: pgmap v804: 353 pgs: 353 active+clean; 114 MiB data, 292 MiB used, 60 GiB / 60 GiB avail; 2.1 MiB/s rd, 1.7 MiB/s wr, 111 op/s
Dec 01 10:17:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101727 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 01 10:17:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [NOTICE] 334/101727 (4) : haproxy version is 2.3.17-d1c9119
Dec 01 10:17:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [NOTICE] 334/101727 (4) : path to executable is /usr/local/sbin/haproxy
Dec 01 10:17:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [ALERT] 334/101727 (4) : backend 'backend' has no server available!
Dec 01 10:17:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:27 compute-2 podman[235099]: 2025-12-01 10:17:27.412515611 +0000 UTC m=+0.067491195 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 01 10:17:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:28.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:28.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:28 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:17:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:29 compute-2 ceph-mon[76053]: pgmap v805: 353 pgs: 353 active+clean; 121 MiB data, 297 MiB used, 60 GiB / 60 GiB avail; 1009 KiB/s rd, 2.1 MiB/s wr, 85 op/s
Dec 01 10:17:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:30.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:30.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:31 compute-2 ceph-mon[76053]: pgmap v806: 353 pgs: 353 active+clean; 121 MiB data, 297 MiB used, 60 GiB / 60 GiB avail; 318 KiB/s rd, 2.1 MiB/s wr, 61 op/s
Dec 01 10:17:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:17:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:32.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:17:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:32.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:33 compute-2 ceph-mon[76053]: pgmap v807: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Dec 01 10:17:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:33 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:17:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:34.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:34.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:35 compute-2 ceph-mon[76053]: pgmap v808: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Dec 01 10:17:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:36.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:36.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:37 compute-2 ceph-mon[76053]: pgmap v809: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 01 10:17:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:17:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:38.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:17:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:17:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:38.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:17:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:38 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:17:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:39 compute-2 ceph-mon[76053]: pgmap v810: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 138 KiB/s rd, 444 KiB/s wr, 21 op/s
Dec 01 10:17:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:39 compute-2 podman[235133]: 2025-12-01 10:17:39.429682683 +0000 UTC m=+0.085172548 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 01 10:17:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:17:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:40.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:17:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:40.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:17:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:41 compute-2 ceph-mon[76053]: pgmap v811: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 7.6 KiB/s rd, 12 KiB/s wr, 2 op/s
Dec 01 10:17:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:42.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:42.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:42 compute-2 ceph-mon[76053]: pgmap v812: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 8.1 KiB/s rd, 15 KiB/s wr, 3 op/s
Dec 01 10:17:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:43 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:17:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:44 compute-2 sudo[235164]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:17:44 compute-2 sudo[235164]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:17:44 compute-2 sudo[235164]: pam_unix(sudo:session): session closed for user root
Dec 01 10:17:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:44.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:17:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:44.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:17:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:44 compute-2 ceph-mon[76053]: pgmap v813: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 3.2 KiB/s wr, 1 op/s
Dec 01 10:17:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:46.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:17:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:46.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:17:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:46 compute-2 ceph-mon[76053]: pgmap v814: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 7.2 KiB/s wr, 2 op/s
Dec 01 10:17:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:47 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:17:47.682 141949 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 01 10:17:47 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:17:47.683 141949 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 01 10:17:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:48.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:48.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:48 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:17:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:48 compute-2 ceph-mon[76053]: pgmap v815: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 9.0 KiB/s rd, 6.2 KiB/s wr, 2 op/s
Dec 01 10:17:49 compute-2 nova_compute[230216]: 2025-12-01 10:17:49.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:17:49 compute-2 nova_compute[230216]: 2025-12-01 10:17:49.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:17:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:50 compute-2 nova_compute[230216]: 2025-12-01 10:17:50.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:17:50 compute-2 nova_compute[230216]: 2025-12-01 10:17:50.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 10:17:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:17:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:50.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:17:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:50.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:51 compute-2 ceph-mon[76053]: pgmap v816: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 8.1 KiB/s rd, 6.2 KiB/s wr, 2 op/s
Dec 01 10:17:51 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/120430286' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:17:51 compute-2 nova_compute[230216]: 2025-12-01 10:17:51.201 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:17:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:51 compute-2 nova_compute[230216]: 2025-12-01 10:17:51.419 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:17:51 compute-2 nova_compute[230216]: 2025-12-01 10:17:51.419 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:17:51 compute-2 nova_compute[230216]: 2025-12-01 10:17:51.447 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:17:51 compute-2 nova_compute[230216]: 2025-12-01 10:17:51.448 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:17:51 compute-2 nova_compute[230216]: 2025-12-01 10:17:51.448 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:17:51 compute-2 nova_compute[230216]: 2025-12-01 10:17:51.448 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 10:17:51 compute-2 nova_compute[230216]: 2025-12-01 10:17:51.448 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:17:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:51 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:17:51 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4190207475' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:17:51 compute-2 nova_compute[230216]: 2025-12-01 10:17:51.940 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:17:52 compute-2 nova_compute[230216]: 2025-12-01 10:17:52.088 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 10:17:52 compute-2 nova_compute[230216]: 2025-12-01 10:17:52.089 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5237MB free_disk=59.942684173583984GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 10:17:52 compute-2 nova_compute[230216]: 2025-12-01 10:17:52.090 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:17:52 compute-2 nova_compute[230216]: 2025-12-01 10:17:52.090 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:17:52 compute-2 nova_compute[230216]: 2025-12-01 10:17:52.153 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 10:17:52 compute-2 nova_compute[230216]: 2025-12-01 10:17:52.154 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 10:17:52 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/4190207475' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:17:52 compute-2 nova_compute[230216]: 2025-12-01 10:17:52.173 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:17:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:52.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:52.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:52 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:17:52 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/320669018' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:17:52 compute-2 nova_compute[230216]: 2025-12-01 10:17:52.647 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:17:52 compute-2 nova_compute[230216]: 2025-12-01 10:17:52.654 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 10:17:52 compute-2 nova_compute[230216]: 2025-12-01 10:17:52.676 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 10:17:52 compute-2 nova_compute[230216]: 2025-12-01 10:17:52.678 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 10:17:52 compute-2 nova_compute[230216]: 2025-12-01 10:17:52.678 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.588s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:17:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:53 compute-2 ceph-mon[76053]: pgmap v817: 353 pgs: 353 active+clean; 41 MiB data, 258 MiB used, 60 GiB / 60 GiB avail; 27 KiB/s rd, 7.3 KiB/s wr, 30 op/s
Dec 01 10:17:53 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/320669018' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:17:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:53 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:17:53 compute-2 nova_compute[230216]: 2025-12-01 10:17:53.467 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:17:53 compute-2 nova_compute[230216]: 2025-12-01 10:17:53.469 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:17:53 compute-2 nova_compute[230216]: 2025-12-01 10:17:53.469 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 10:17:53 compute-2 nova_compute[230216]: 2025-12-01 10:17:53.469 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 10:17:53 compute-2 nova_compute[230216]: 2025-12-01 10:17:53.485 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 10:17:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:54 compute-2 nova_compute[230216]: 2025-12-01 10:17:54.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:17:54 compute-2 nova_compute[230216]: 2025-12-01 10:17:54.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:17:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:54.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:54 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/408947741' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:17:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:54.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:54 compute-2 podman[235243]: 2025-12-01 10:17:54.40064916 +0000 UTC m=+0.050403647 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 01 10:17:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:55 compute-2 ceph-mon[76053]: pgmap v818: 353 pgs: 353 active+clean; 41 MiB data, 258 MiB used, 60 GiB / 60 GiB avail; 27 KiB/s rd, 5.2 KiB/s wr, 29 op/s
Dec 01 10:17:55 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2807815281' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:17:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:17:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:56.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:17:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:56.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:17:56 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1773807356' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:17:56 compute-2 ceph-mon[76053]: pgmap v819: 353 pgs: 353 active+clean; 41 MiB data, 258 MiB used, 60 GiB / 60 GiB avail; 27 KiB/s rd, 5.2 KiB/s wr, 29 op/s
Dec 01 10:17:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:56 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:17:56.685 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 01 10:17:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:57 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2141696538' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:17:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:17:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:58.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:17:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:17:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:17:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:58.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:17:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:58 compute-2 ceph-mon[76053]: pgmap v820: 353 pgs: 353 active+clean; 41 MiB data, 258 MiB used, 60 GiB / 60 GiB avail; 26 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Dec 01 10:17:58 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:17:58 compute-2 podman[235266]: 2025-12-01 10:17:58.412968665 +0000 UTC m=+0.065665390 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Dec 01 10:17:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:17:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:00.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:00.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:00 compute-2 ceph-mon[76053]: pgmap v821: 353 pgs: 353 active+clean; 41 MiB data, 258 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Dec 01 10:18:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:02.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:02.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:02 compute-2 ceph-mon[76053]: pgmap v822: 353 pgs: 353 active+clean; 41 MiB data, 258 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Dec 01 10:18:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:03 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:18:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:04.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:18:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:04.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:18:04 compute-2 sudo[235293]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:18:04 compute-2 sudo[235293]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:18:04 compute-2 sudo[235293]: pam_unix(sudo:session): session closed for user root
Dec 01 10:18:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:18:04.703 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:18:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:18:04.704 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:18:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:18:04.704 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:18:04 compute-2 ceph-mon[76053]: pgmap v823: 353 pgs: 353 active+clean; 41 MiB data, 258 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:18:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:18:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:06.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:18:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:06.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:06 compute-2 ceph-mon[76053]: pgmap v824: 353 pgs: 353 active+clean; 41 MiB data, 258 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:18:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/2069587201' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:18:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/2069587201' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:18:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:08.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:08.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:08 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:18:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:09 compute-2 ceph-mon[76053]: pgmap v825: 353 pgs: 353 active+clean; 41 MiB data, 258 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:18:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:18:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:10.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:10.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:10 compute-2 podman[235324]: 2025-12-01 10:18:10.416409611 +0000 UTC m=+0.071369051 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 01 10:18:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:11 compute-2 ceph-mon[76053]: pgmap v826: 353 pgs: 353 active+clean; 41 MiB data, 258 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:18:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:12.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:12.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:13 compute-2 ceph-mon[76053]: pgmap v827: 353 pgs: 353 active+clean; 41 MiB data, 258 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:18:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:13 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:18:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:13 compute-2 sudo[235355]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:18:13 compute-2 sudo[235355]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:18:13 compute-2 sudo[235355]: pam_unix(sudo:session): session closed for user root
Dec 01 10:18:13 compute-2 sudo[235380]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:18:13 compute-2 sudo[235380]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:18:14 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3815150366' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:18:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:14.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:14.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:14 compute-2 sudo[235380]: pam_unix(sudo:session): session closed for user root
Dec 01 10:18:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:15 compute-2 ceph-mon[76053]: pgmap v828: 353 pgs: 353 active+clean; 41 MiB data, 258 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:18:15 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:18:15 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:18:15 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:18:15 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:18:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:16 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:18:16 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:18:16 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:18:16 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:18:16 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:18:16 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:18:16 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:18:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:16.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:18:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:16.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:18:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:17 compute-2 ceph-mon[76053]: pgmap v829: 353 pgs: 353 active+clean; 68 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.1 MiB/s wr, 26 op/s
Dec 01 10:18:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:18.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:18.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:18 compute-2 ceph-mon[76053]: pgmap v830: 353 pgs: 353 active+clean; 88 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 01 10:18:18 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:18:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:19 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3893309357' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 10:18:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:20.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:20.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:20 compute-2 ceph-mon[76053]: pgmap v831: 353 pgs: 353 active+clean; 88 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 01 10:18:20 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2545059359' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 10:18:21 compute-2 sudo[235442]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:18:21 compute-2 sudo[235442]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:18:21 compute-2 sudo[235442]: pam_unix(sudo:session): session closed for user root
Dec 01 10:18:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:21 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:18:21 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:18:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:22.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:22.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:23 compute-2 ceph-mon[76053]: pgmap v832: 353 pgs: 353 active+clean; 88 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 01 10:18:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:23 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:18:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:24.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:24.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:24 compute-2 sudo[235471]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:18:24 compute-2 sudo[235471]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:18:24 compute-2 sudo[235471]: pam_unix(sudo:session): session closed for user root
Dec 01 10:18:24 compute-2 podman[235495]: 2025-12-01 10:18:24.513549885 +0000 UTC m=+0.056520309 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 01 10:18:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:25 compute-2 ceph-mon[76053]: pgmap v833: 353 pgs: 353 active+clean; 88 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 01 10:18:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:18:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:26.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:26.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:26 compute-2 ceph-mon[76053]: pgmap v834: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.2 MiB/s rd, 1.8 MiB/s wr, 76 op/s
Dec 01 10:18:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:28.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:28.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:28 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:18:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:29 compute-2 ceph-mon[76053]: pgmap v835: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 739 KiB/s wr, 75 op/s
Dec 01 10:18:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:29 compute-2 podman[235523]: 2025-12-01 10:18:29.39240372 +0000 UTC m=+0.051765943 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125)
Dec 01 10:18:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:18:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:30.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:18:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:30 compute-2 ceph-mon[76053]: pgmap v836: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Dec 01 10:18:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:30.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:31 compute-2 sshd-session[235520]: Received disconnect from 45.78.219.119 port 45196:11: Bye Bye [preauth]
Dec 01 10:18:31 compute-2 sshd-session[235520]: Disconnected from authenticating user root 45.78.219.119 port 45196 [preauth]
Dec 01 10:18:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:32.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:18:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:32.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:18:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:32 compute-2 ceph-mon[76053]: pgmap v837: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Dec 01 10:18:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:33 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:18:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:34.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:34.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:35 compute-2 ceph-mon[76053]: pgmap v838: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Dec 01 10:18:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:36.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:36.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:37 compute-2 ceph-mon[76053]: pgmap v839: 353 pgs: 353 active+clean; 115 MiB data, 297 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 1.4 MiB/s wr, 131 op/s
Dec 01 10:18:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:38.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:38.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:38 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:18:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:39 compute-2 ceph-mon[76053]: pgmap v840: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 1.1 MiB/s rd, 2.1 MiB/s wr, 89 op/s
Dec 01 10:18:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:18:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:40.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:18:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:40.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:18:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:41 compute-2 ceph-mon[76053]: pgmap v841: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 01 10:18:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:41 compute-2 podman[235556]: 2025-12-01 10:18:41.474045886 +0000 UTC m=+0.135174961 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 01 10:18:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:42.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:42 compute-2 ceph-mon[76053]: pgmap v842: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Dec 01 10:18:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:42.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:43 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:18:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:44.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:44.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:44 compute-2 sudo[235586]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:18:44 compute-2 sudo[235586]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:18:44 compute-2 sudo[235586]: pam_unix(sudo:session): session closed for user root
Dec 01 10:18:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:44 compute-2 ceph-mon[76053]: pgmap v843: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 01 10:18:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:46.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:18:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:46.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:18:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:46 compute-2 ceph-mon[76053]: pgmap v844: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 333 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Dec 01 10:18:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:48.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:48 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:18:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:48.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:49 compute-2 ceph-mon[76053]: pgmap v845: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 35 KiB/s rd, 722 KiB/s wr, 8 op/s
Dec 01 10:18:49 compute-2 nova_compute[230216]: 2025-12-01 10:18:49.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:18:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:50 compute-2 nova_compute[230216]: 2025-12-01 10:18:50.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:18:50 compute-2 nova_compute[230216]: 2025-12-01 10:18:50.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:18:50 compute-2 nova_compute[230216]: 2025-12-01 10:18:50.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 10:18:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:18:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:50.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:18:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:50.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:51 compute-2 nova_compute[230216]: 2025-12-01 10:18:51.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:18:51 compute-2 ceph-mon[76053]: pgmap v846: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 6.5 KiB/s rd, 11 KiB/s wr, 1 op/s
Dec 01 10:18:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:52 compute-2 nova_compute[230216]: 2025-12-01 10:18:52.201 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:18:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:52.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:52.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:53 compute-2 nova_compute[230216]: 2025-12-01 10:18:53.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:18:53 compute-2 nova_compute[230216]: 2025-12-01 10:18:53.231 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:18:53 compute-2 nova_compute[230216]: 2025-12-01 10:18:53.231 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:18:53 compute-2 nova_compute[230216]: 2025-12-01 10:18:53.231 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:18:53 compute-2 nova_compute[230216]: 2025-12-01 10:18:53.231 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 10:18:53 compute-2 nova_compute[230216]: 2025-12-01 10:18:53.232 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:18:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:53 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:18:53 compute-2 ceph-mon[76053]: pgmap v847: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 6.7 KiB/s rd, 17 KiB/s wr, 1 op/s
Dec 01 10:18:53 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:18:53 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3744640447' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:18:53 compute-2 nova_compute[230216]: 2025-12-01 10:18:53.717 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:18:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:53 compute-2 nova_compute[230216]: 2025-12-01 10:18:53.890 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 10:18:53 compute-2 nova_compute[230216]: 2025-12-01 10:18:53.892 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5236MB free_disk=59.94270324707031GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 10:18:53 compute-2 nova_compute[230216]: 2025-12-01 10:18:53.892 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:18:53 compute-2 nova_compute[230216]: 2025-12-01 10:18:53.892 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:18:54 compute-2 nova_compute[230216]: 2025-12-01 10:18:54.246 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 10:18:54 compute-2 nova_compute[230216]: 2025-12-01 10:18:54.247 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 10:18:54 compute-2 nova_compute[230216]: 2025-12-01 10:18:54.270 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:18:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:18:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:54.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:18:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:54.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:54 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3744640447' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:18:54 compute-2 ceph-mon[76053]: pgmap v848: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 6.5 KiB/s rd, 17 KiB/s wr, 1 op/s
Dec 01 10:18:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:54 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:18:54 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2793003043' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:18:54 compute-2 nova_compute[230216]: 2025-12-01 10:18:54.886 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.617s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:18:54 compute-2 nova_compute[230216]: 2025-12-01 10:18:54.893 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 10:18:54 compute-2 nova_compute[230216]: 2025-12-01 10:18:54.916 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 10:18:54 compute-2 nova_compute[230216]: 2025-12-01 10:18:54.919 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 10:18:54 compute-2 nova_compute[230216]: 2025-12-01 10:18:54.920 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.028s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:18:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:55 compute-2 podman[235667]: 2025-12-01 10:18:55.392450064 +0000 UTC m=+0.050875001 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, 
tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 01 10:18:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:18:55 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/839179454' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:18:55 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2612711508' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:18:55 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2793003043' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:18:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:55 compute-2 nova_compute[230216]: 2025-12-01 10:18:55.921 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:18:55 compute-2 nova_compute[230216]: 2025-12-01 10:18:55.922 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 10:18:55 compute-2 nova_compute[230216]: 2025-12-01 10:18:55.922 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 10:18:55 compute-2 nova_compute[230216]: 2025-12-01 10:18:55.943 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 10:18:55 compute-2 nova_compute[230216]: 2025-12-01 10:18:55.943 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:18:56 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:18:56.066 141949 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 01 10:18:56 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:18:56.067 141949 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 01 10:18:56 compute-2 nova_compute[230216]: 2025-12-01 10:18:56.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:18:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:56.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:56.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:56 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2054872021' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:18:56 compute-2 ceph-mon[76053]: pgmap v849: 353 pgs: 353 active+clean; 134 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 13 KiB/s rd, 662 KiB/s wr, 12 op/s
Dec 01 10:18:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:57 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2091576883' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:18:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:58.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:58 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:18:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:18:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:18:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:58.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:18:58 compute-2 ceph-mon[76053]: pgmap v850: 353 pgs: 353 active+clean; 137 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 7.1 KiB/s rd, 661 KiB/s wr, 12 op/s
Dec 01 10:18:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:18:59 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3707188935' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:19:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:00.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:00.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:00 compute-2 podman[235691]: 2025-12-01 10:19:00.399414955 +0000 UTC m=+0.058537048 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 10:19:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:01 compute-2 ceph-mon[76053]: pgmap v851: 353 pgs: 353 active+clean; 137 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 6.8 KiB/s rd, 661 KiB/s wr, 12 op/s
Dec 01 10:19:01 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3765291121' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 10:19:01 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3067651324' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 10:19:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:19:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:02.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:19:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:02.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:03 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:19:03.070 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 01 10:19:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:03 compute-2 ceph-mon[76053]: pgmap v852: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Dec 01 10:19:03 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:19:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:04.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:19:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:04.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:19:04 compute-2 ceph-mon[76053]: pgmap v853: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 01 10:19:04 compute-2 sudo[235715]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:19:04 compute-2 sudo[235715]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:19:04 compute-2 sudo[235715]: pam_unix(sudo:session): session closed for user root
Dec 01 10:19:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:19:04.705 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:19:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:19:04.706 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:19:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:19:04.707 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:19:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:06.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:06.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:06 compute-2 ceph-mon[76053]: pgmap v854: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.8 MiB/s wr, 31 op/s
Dec 01 10:19:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 01 10:19:07 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3262782744' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:19:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 01 10:19:07 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3262782744' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:19:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/3262782744' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:19:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/3262782744' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:19:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:08.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:08.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:08 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:19:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:09 compute-2 ceph-mon[76053]: pgmap v855: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 MiB/s wr, 26 op/s
Dec 01 10:19:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:10.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:19:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:10.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:19:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:19:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:11 compute-2 ceph-mon[76053]: pgmap v856: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.1 MiB/s wr, 25 op/s
Dec 01 10:19:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:19:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:12.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:19:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:19:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:12.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:19:12 compute-2 podman[235748]: 2025-12-01 10:19:12.489248691 +0000 UTC m=+0.141942525 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 01 10:19:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:12 compute-2 ceph-mon[76053]: pgmap v857: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.1 MiB/s wr, 90 op/s
Dec 01 10:19:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:13 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:19:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:19:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:14.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:19:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:14.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:14 compute-2 ceph-mon[76053]: pgmap v858: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 15 KiB/s wr, 74 op/s
Dec 01 10:19:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:16.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:16.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:17 compute-2 ceph-mon[76053]: pgmap v859: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 15 KiB/s wr, 74 op/s
Dec 01 10:19:17 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 10:19:17 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.0 total, 600.0 interval
                                           Cumulative writes: 5239 writes, 27K keys, 5239 commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.04 MB/s
                                           Cumulative WAL: 5239 writes, 5239 syncs, 1.00 writes per sync, written: 0.07 GB, 0.04 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1525 writes, 7043 keys, 1525 commit groups, 1.0 writes per commit group, ingest: 17.12 MB, 0.03 MB/s
                                           Interval WAL: 1525 writes, 1525 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    125.2      0.31              0.11        14    0.022       0      0       0.0       0.0
                                             L6      1/0   12.65 MB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   4.3    134.1    116.0      1.44              0.45        13    0.111     67K   6761       0.0       0.0
                                            Sum      1/0   12.65 MB   0.0      0.2     0.0      0.2       0.2      0.1       0.0   5.3    110.4    117.7      1.75              0.56        27    0.065     67K   6761       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.6     87.3     86.8      0.69              0.16         8    0.086     23K   2076       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   0.0    134.1    116.0      1.44              0.45        13    0.111     67K   6761       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    126.2      0.31              0.11        13    0.024       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1800.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.038, interval 0.008
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.20 GB write, 0.11 MB/s write, 0.19 GB read, 0.11 MB/s read, 1.7 seconds
                                           Interval compaction: 0.06 GB write, 0.10 MB/s write, 0.06 GB read, 0.10 MB/s read, 0.7 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555b631689b0#2 capacity: 304.00 MB usage: 13.52 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 0.000135 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(713,12.98 MB,4.26849%) FilterBlock(27,201.92 KB,0.0648649%) IndexBlock(27,350.20 KB,0.112498%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Dec 01 10:19:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:18.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:19:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:18.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:19:18 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:19:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:19 compute-2 ceph-mon[76053]: pgmap v860: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 3.0 KiB/s wr, 71 op/s
Dec 01 10:19:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:19:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:20.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:19:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:20 compute-2 ceph-mon[76053]: pgmap v861: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 2.7 KiB/s wr, 64 op/s
Dec 01 10:19:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:20.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:21 compute-2 sudo[235783]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:19:21 compute-2 sudo[235783]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:19:21 compute-2 sudo[235783]: pam_unix(sudo:session): session closed for user root
Dec 01 10:19:21 compute-2 sudo[235808]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Dec 01 10:19:21 compute-2 sudo[235808]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:19:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:21 compute-2 sudo[235808]: pam_unix(sudo:session): session closed for user root
Dec 01 10:19:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:21 compute-2 sudo[235854]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:19:21 compute-2 sudo[235854]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:19:21 compute-2 sudo[235854]: pam_unix(sudo:session): session closed for user root
Dec 01 10:19:21 compute-2 sudo[235879]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:19:21 compute-2 sudo[235879]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:19:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:22.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:22 compute-2 sudo[235879]: pam_unix(sudo:session): session closed for user root
Dec 01 10:19:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:22 compute-2 sudo[235934]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:19:22 compute-2 sudo[235934]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:19:22 compute-2 sudo[235934]: pam_unix(sudo:session): session closed for user root
Dec 01 10:19:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:19:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:22.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:19:22 compute-2 sudo[235959]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 365f19c2-81e5-5edd-b6b4-280555214d3a -- inventory --format=json-pretty --filter-for-batch
Dec 01 10:19:22 compute-2 sudo[235959]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:19:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:22 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:19:22 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:19:22 compute-2 ceph-mon[76053]: pgmap v862: 353 pgs: 353 active+clean; 200 MiB data, 352 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 128 op/s
Dec 01 10:19:22 compute-2 podman[236026]: 2025-12-01 10:19:22.896139232 +0000 UTC m=+0.041710565 container create 95cac83075a62cc273b0da35c9c0b174675ab9a04c5ae961b8389f18baab6284 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_banach, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 01 10:19:22 compute-2 systemd[1]: Started libpod-conmon-95cac83075a62cc273b0da35c9c0b174675ab9a04c5ae961b8389f18baab6284.scope.
Dec 01 10:19:22 compute-2 systemd[1]: Started libcrun container.
Dec 01 10:19:22 compute-2 podman[236026]: 2025-12-01 10:19:22.875737732 +0000 UTC m=+0.021309095 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 10:19:22 compute-2 podman[236026]: 2025-12-01 10:19:22.972901847 +0000 UTC m=+0.118473220 container init 95cac83075a62cc273b0da35c9c0b174675ab9a04c5ae961b8389f18baab6284 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_banach, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325)
Dec 01 10:19:22 compute-2 podman[236026]: 2025-12-01 10:19:22.979420847 +0000 UTC m=+0.124992180 container start 95cac83075a62cc273b0da35c9c0b174675ab9a04c5ae961b8389f18baab6284 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_banach, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Dec 01 10:19:22 compute-2 podman[236026]: 2025-12-01 10:19:22.98360824 +0000 UTC m=+0.129179593 container attach 95cac83075a62cc273b0da35c9c0b174675ab9a04c5ae961b8389f18baab6284 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_banach, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.40.1, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 01 10:19:22 compute-2 intelligent_banach[236042]: 167 167
Dec 01 10:19:22 compute-2 systemd[1]: libpod-95cac83075a62cc273b0da35c9c0b174675ab9a04c5ae961b8389f18baab6284.scope: Deactivated successfully.
Dec 01 10:19:22 compute-2 podman[236026]: 2025-12-01 10:19:22.990101829 +0000 UTC m=+0.135673172 container died 95cac83075a62cc273b0da35c9c0b174675ab9a04c5ae961b8389f18baab6284 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_banach, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 10:19:23 compute-2 systemd[1]: var-lib-containers-storage-overlay-71bd18c8e6c8e351a61c259f217b8011b5921085eeb41f1a0af490767e4da751-merged.mount: Deactivated successfully.
Dec 01 10:19:23 compute-2 podman[236026]: 2025-12-01 10:19:23.03330785 +0000 UTC m=+0.178879193 container remove 95cac83075a62cc273b0da35c9c0b174675ab9a04c5ae961b8389f18baab6284 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_banach, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 10:19:23 compute-2 systemd[1]: libpod-conmon-95cac83075a62cc273b0da35c9c0b174675ab9a04c5ae961b8389f18baab6284.scope: Deactivated successfully.
Dec 01 10:19:23 compute-2 podman[236068]: 2025-12-01 10:19:23.188637853 +0000 UTC m=+0.040529656 container create 89cbeb2a0fd040af075374dab5bd8b3273afa652daf7dcabad66dbe81abefb14 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_tu, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec 01 10:19:23 compute-2 systemd[1]: Started libpod-conmon-89cbeb2a0fd040af075374dab5bd8b3273afa652daf7dcabad66dbe81abefb14.scope.
Dec 01 10:19:23 compute-2 systemd[1]: Started libcrun container.
Dec 01 10:19:23 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/537ef7455e23482e6a2a44559f0bf7bf70eead3ead76a977233c9010fd337739/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 01 10:19:23 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/537ef7455e23482e6a2a44559f0bf7bf70eead3ead76a977233c9010fd337739/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 01 10:19:23 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/537ef7455e23482e6a2a44559f0bf7bf70eead3ead76a977233c9010fd337739/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 01 10:19:23 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/537ef7455e23482e6a2a44559f0bf7bf70eead3ead76a977233c9010fd337739/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 01 10:19:23 compute-2 podman[236068]: 2025-12-01 10:19:23.261511692 +0000 UTC m=+0.113403515 container init 89cbeb2a0fd040af075374dab5bd8b3273afa652daf7dcabad66dbe81abefb14 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_tu, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Dec 01 10:19:23 compute-2 podman[236068]: 2025-12-01 10:19:23.171911572 +0000 UTC m=+0.023803405 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 01 10:19:23 compute-2 podman[236068]: 2025-12-01 10:19:23.268855052 +0000 UTC m=+0.120746855 container start 89cbeb2a0fd040af075374dab5bd8b3273afa652daf7dcabad66dbe81abefb14 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_tu, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 10:19:23 compute-2 podman[236068]: 2025-12-01 10:19:23.272457721 +0000 UTC m=+0.124349554 container attach 89cbeb2a0fd040af075374dab5bd8b3273afa652daf7dcabad66dbe81abefb14 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_tu, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 01 10:19:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:23 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:19:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:24 compute-2 vigilant_tu[236085]: [
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:     {
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:         "available": false,
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:         "being_replaced": false,
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:         "ceph_device_lvm": false,
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:         "lsm_data": {},
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:         "lvs": [],
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:         "path": "/dev/sr0",
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:         "rejected_reasons": [
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:             "Has a FileSystem",
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:             "Insufficient space (<5GB)"
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:         ],
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:         "sys_api": {
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:             "actuators": null,
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:             "device_nodes": [
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:                 "sr0"
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:             ],
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:             "devname": "sr0",
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:             "human_readable_size": "482.00 KB",
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:             "id_bus": "ata",
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:             "model": "QEMU DVD-ROM",
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:             "nr_requests": "2",
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:             "parent": "/dev/sr0",
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:             "partitions": {},
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:             "path": "/dev/sr0",
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:             "removable": "1",
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:             "rev": "2.5+",
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:             "ro": "0",
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:             "rotational": "1",
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:             "sas_address": "",
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:             "sas_device_handle": "",
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:             "scheduler_mode": "mq-deadline",
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:             "sectors": 0,
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:             "sectorsize": "2048",
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:             "size": 493568.0,
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:             "support_discard": "2048",
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:             "type": "disk",
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:             "vendor": "QEMU"
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:         }
Dec 01 10:19:24 compute-2 vigilant_tu[236085]:     }
Dec 01 10:19:24 compute-2 vigilant_tu[236085]: ]
Dec 01 10:19:24 compute-2 systemd[1]: libpod-89cbeb2a0fd040af075374dab5bd8b3273afa652daf7dcabad66dbe81abefb14.scope: Deactivated successfully.
Dec 01 10:19:24 compute-2 podman[237429]: 2025-12-01 10:19:24.14942988 +0000 UTC m=+0.038333181 container died 89cbeb2a0fd040af075374dab5bd8b3273afa652daf7dcabad66dbe81abefb14 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_tu, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=squid)
Dec 01 10:19:24 compute-2 systemd[1]: var-lib-containers-storage-overlay-537ef7455e23482e6a2a44559f0bf7bf70eead3ead76a977233c9010fd337739-merged.mount: Deactivated successfully.
Dec 01 10:19:24 compute-2 podman[237429]: 2025-12-01 10:19:24.18729391 +0000 UTC m=+0.076197181 container remove 89cbeb2a0fd040af075374dab5bd8b3273afa652daf7dcabad66dbe81abefb14 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_tu, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec 01 10:19:24 compute-2 systemd[1]: libpod-conmon-89cbeb2a0fd040af075374dab5bd8b3273afa652daf7dcabad66dbe81abefb14.scope: Deactivated successfully.
Dec 01 10:19:24 compute-2 sudo[235959]: pam_unix(sudo:session): session closed for user root
Dec 01 10:19:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:24.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:24.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:24 compute-2 sudo[237444]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:19:24 compute-2 sudo[237444]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:19:24 compute-2 sudo[237444]: pam_unix(sudo:session): session closed for user root
Dec 01 10:19:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:19:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:19:25 compute-2 ceph-mon[76053]: pgmap v863: 353 pgs: 353 active+clean; 200 MiB data, 352 MiB used, 60 GiB / 60 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 01 10:19:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:19:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:19:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:19:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:19:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:19:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:19:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:19:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:26 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:19:26 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:19:26 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:19:26 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:19:26 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:19:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:19:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:26.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:19:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:26.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:26 compute-2 podman[237471]: 2025-12-01 10:19:26.447491949 +0000 UTC m=+0.084061735 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 01 10:19:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:27 compute-2 ceph-mon[76053]: pgmap v864: 353 pgs: 353 active+clean; 200 MiB data, 352 MiB used, 60 GiB / 60 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 01 10:19:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:28.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:28.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:28 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:19:28 compute-2 ceph-mon[76053]: pgmap v865: 353 pgs: 353 active+clean; 200 MiB data, 352 MiB used, 60 GiB / 60 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Dec 01 10:19:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:30 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2475863935' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:19:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:30.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:19:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:30.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:19:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:31 compute-2 ceph-mon[76053]: pgmap v866: 353 pgs: 353 active+clean; 200 MiB data, 352 MiB used, 60 GiB / 60 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 01 10:19:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:31 compute-2 podman[237496]: 2025-12-01 10:19:31.425662824 +0000 UTC m=+0.082545818 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 01 10:19:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:31 compute-2 sudo[237518]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:19:31 compute-2 sudo[237518]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:19:31 compute-2 sudo[237518]: pam_unix(sudo:session): session closed for user root
Dec 01 10:19:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:32.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:32.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:32 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:19:32 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:19:32 compute-2 ceph-mon[76053]: pgmap v867: 353 pgs: 353 active+clean; 121 MiB data, 307 MiB used, 60 GiB / 60 GiB avail; 345 KiB/s rd, 2.1 MiB/s wr, 92 op/s
Dec 01 10:19:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:33 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:19:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:34.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:34.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:34 compute-2 ceph-mon[76053]: pgmap v868: 353 pgs: 353 active+clean; 121 MiB data, 307 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 13 KiB/s wr, 28 op/s
Dec 01 10:19:34 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1410185725' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:19:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:19:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:36.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:19:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:36.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:37 compute-2 ceph-mon[76053]: pgmap v869: 353 pgs: 353 active+clean; 66 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 39 KiB/s rd, 15 KiB/s wr, 55 op/s
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.395656) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584377395805, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 2363, "num_deletes": 251, "total_data_size": 6375435, "memory_usage": 6475872, "flush_reason": "Manual Compaction"}
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584377421440, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 4091309, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 26081, "largest_seqno": 28438, "table_properties": {"data_size": 4081955, "index_size": 5848, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19577, "raw_average_key_size": 20, "raw_value_size": 4063125, "raw_average_value_size": 4206, "num_data_blocks": 256, "num_entries": 966, "num_filter_entries": 966, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764584174, "oldest_key_time": 1764584174, "file_creation_time": 1764584377, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 25840 microseconds, and 8345 cpu microseconds.
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.421521) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 4091309 bytes OK
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.421544) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.423125) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.423140) EVENT_LOG_v1 {"time_micros": 1764584377423135, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.423162) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 6365039, prev total WAL file size 6365039, number of live WAL files 2.
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.424410) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032303038' seq:72057594037927935, type:22 .. '7061786F730032323630' seq:0, type:0; will stop at (end)
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(3995KB)], [51(12MB)]
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584377424475, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 17352153, "oldest_snapshot_seqno": -1}
Dec 01 10:19:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 5880 keys, 15145326 bytes, temperature: kUnknown
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584377962748, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 15145326, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15104929, "index_size": 24607, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14725, "raw_key_size": 149317, "raw_average_key_size": 25, "raw_value_size": 14997512, "raw_average_value_size": 2550, "num_data_blocks": 1006, "num_entries": 5880, "num_filter_entries": 5880, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764584377, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.963055) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 15145326 bytes
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.973745) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 32.2 rd, 28.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.9, 12.6 +0.0 blob) out(14.4 +0.0 blob), read-write-amplify(7.9) write-amplify(3.7) OK, records in: 6400, records dropped: 520 output_compression: NoCompression
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.973801) EVENT_LOG_v1 {"time_micros": 1764584377973782, "job": 30, "event": "compaction_finished", "compaction_time_micros": 538357, "compaction_time_cpu_micros": 30911, "output_level": 6, "num_output_files": 1, "total_output_size": 15145326, "num_input_records": 6400, "num_output_records": 5880, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584377974868, "job": 30, "event": "table_file_deletion", "file_number": 53}
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584377977163, "job": 30, "event": "table_file_deletion", "file_number": 51}
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.424324) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.977236) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.977241) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.977243) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.977245) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:19:37 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.977248) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:19:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:38.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:19:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:38.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:19:38 compute-2 ceph-mon[76053]: pgmap v870: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 38 KiB/s rd, 2.7 KiB/s wr, 56 op/s
Dec 01 10:19:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:38 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:19:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:39 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:19:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:40.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:19:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:40.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:19:40 compute-2 ceph-mon[76053]: pgmap v871: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 38 KiB/s rd, 2.3 KiB/s wr, 55 op/s
Dec 01 10:19:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:42.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:42.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:42 compute-2 ceph-mon[76053]: pgmap v872: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 38 KiB/s rd, 2.3 KiB/s wr, 56 op/s
Dec 01 10:19:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:43 compute-2 podman[237554]: 2025-12-01 10:19:43.427878008 +0000 UTC m=+0.081907612 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 01 10:19:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:43 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:19:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:19:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:44.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:19:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:44.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:44 compute-2 sudo[237582]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:19:44 compute-2 sudo[237582]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:19:44 compute-2 sudo[237582]: pam_unix(sudo:session): session closed for user root
Dec 01 10:19:45 compute-2 ceph-mon[76053]: pgmap v873: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Dec 01 10:19:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:19:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:46.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:19:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:19:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:46.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:19:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:47 compute-2 ceph-mon[76053]: pgmap v874: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Dec 01 10:19:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:48 compute-2 nova_compute[230216]: 2025-12-01 10:19:48.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:19:48 compute-2 nova_compute[230216]: 2025-12-01 10:19:48.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 01 10:19:48 compute-2 nova_compute[230216]: 2025-12-01 10:19:48.323 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 01 10:19:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:48.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:48 compute-2 ceph-mon[76053]: pgmap v875: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 B/s wr, 1 op/s
Dec 01 10:19:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:48.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:48 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:19:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:50 compute-2 nova_compute[230216]: 2025-12-01 10:19:50.323 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:19:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:19:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:50.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:19:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:50.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:50 compute-2 ceph-mon[76053]: pgmap v876: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:19:51 compute-2 nova_compute[230216]: 2025-12-01 10:19:51.201 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:19:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:52 compute-2 nova_compute[230216]: 2025-12-01 10:19:52.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:19:52 compute-2 nova_compute[230216]: 2025-12-01 10:19:52.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:19:52 compute-2 nova_compute[230216]: 2025-12-01 10:19:52.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:19:52 compute-2 nova_compute[230216]: 2025-12-01 10:19:52.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 10:19:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:52.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:19:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:52.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:19:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:53 compute-2 ceph-mon[76053]: pgmap v877: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:19:53 compute-2 nova_compute[230216]: 2025-12-01 10:19:53.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:19:53 compute-2 nova_compute[230216]: 2025-12-01 10:19:53.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:19:53 compute-2 nova_compute[230216]: 2025-12-01 10:19:53.241 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:19:53 compute-2 nova_compute[230216]: 2025-12-01 10:19:53.241 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:19:53 compute-2 nova_compute[230216]: 2025-12-01 10:19:53.241 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:19:53 compute-2 nova_compute[230216]: 2025-12-01 10:19:53.241 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 10:19:53 compute-2 nova_compute[230216]: 2025-12-01 10:19:53.242 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:19:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:53 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:19:53 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/92302176' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:19:53 compute-2 nova_compute[230216]: 2025-12-01 10:19:53.703 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:19:53 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:19:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:53 compute-2 nova_compute[230216]: 2025-12-01 10:19:53.876 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 10:19:53 compute-2 nova_compute[230216]: 2025-12-01 10:19:53.878 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5242MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 10:19:53 compute-2 nova_compute[230216]: 2025-12-01 10:19:53.878 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:19:53 compute-2 nova_compute[230216]: 2025-12-01 10:19:53.879 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:19:54 compute-2 nova_compute[230216]: 2025-12-01 10:19:54.088 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 10:19:54 compute-2 nova_compute[230216]: 2025-12-01 10:19:54.088 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 10:19:54 compute-2 nova_compute[230216]: 2025-12-01 10:19:54.164 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Refreshing inventories for resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 01 10:19:54 compute-2 nova_compute[230216]: 2025-12-01 10:19:54.236 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Updating ProviderTree inventory for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 01 10:19:54 compute-2 nova_compute[230216]: 2025-12-01 10:19:54.237 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Updating inventory in ProviderTree for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 01 10:19:54 compute-2 nova_compute[230216]: 2025-12-01 10:19:54.253 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Refreshing aggregate associations for resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 01 10:19:54 compute-2 nova_compute[230216]: 2025-12-01 10:19:54.274 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Refreshing trait associations for resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE42,HW_CPU_X86_SSSE3,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AESNI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NODE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_FMA3,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_ABM,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SHA,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_F16C,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 01 10:19:54 compute-2 nova_compute[230216]: 2025-12-01 10:19:54.287 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:19:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:54.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:54 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/92302176' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:19:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:19:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:54.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:19:54 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:19:54 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1443106930' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:19:54 compute-2 nova_compute[230216]: 2025-12-01 10:19:54.711 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:19:54 compute-2 nova_compute[230216]: 2025-12-01 10:19:54.718 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 10:19:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:54 compute-2 nova_compute[230216]: 2025-12-01 10:19:54.731 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 10:19:54 compute-2 nova_compute[230216]: 2025-12-01 10:19:54.733 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 10:19:54 compute-2 nova_compute[230216]: 2025-12-01 10:19:54.734 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.855s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:19:55 compute-2 nova_compute[230216]: 2025-12-01 10:19:55.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:19:55 compute-2 nova_compute[230216]: 2025-12-01 10:19:55.209 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 10:19:55 compute-2 nova_compute[230216]: 2025-12-01 10:19:55.209 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 10:19:55 compute-2 nova_compute[230216]: 2025-12-01 10:19:55.226 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 10:19:55 compute-2 nova_compute[230216]: 2025-12-01 10:19:55.227 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:19:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:55 compute-2 ceph-mon[76053]: pgmap v878: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:19:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:19:55 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1443106930' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:19:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:56 compute-2 nova_compute[230216]: 2025-12-01 10:19:56.216 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:19:56 compute-2 nova_compute[230216]: 2025-12-01 10:19:56.217 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:19:56 compute-2 nova_compute[230216]: 2025-12-01 10:19:56.217 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 01 10:19:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:19:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:56.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:19:56 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2796817309' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:19:56 compute-2 ceph-mon[76053]: pgmap v879: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:19:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:19:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:56.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:19:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:57 compute-2 nova_compute[230216]: 2025-12-01 10:19:57.222 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:19:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:57 compute-2 podman[237664]: 2025-12-01 10:19:57.406932266 +0000 UTC m=+0.053772910 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 01 10:19:57 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1156827399' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:19:57 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 10:19:57 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.2 total, 600.0 interval
                                           Cumulative writes: 8246 writes, 33K keys, 8246 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s
                                           Cumulative WAL: 8246 writes, 1954 syncs, 4.22 writes per sync, written: 0.03 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 2348 writes, 8901 keys, 2348 commit groups, 1.0 writes per commit group, ingest: 10.48 MB, 0.02 MB/s
                                           Interval WAL: 2348 writes, 926 syncs, 2.54 writes per sync, written: 0.01 GB, 0.02 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 01 10:19:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:19:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:58.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:19:58 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/129369375' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:19:58 compute-2 ceph-mon[76053]: pgmap v880: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:19:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:19:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:19:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:58.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:19:58 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:19:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:19:59 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3861310351' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:19:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:00.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:00.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:00 compute-2 ceph-mon[76053]: pgmap v881: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:20:00 compute-2 ceph-mon[76053]: Health detail: HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore
Dec 01 10:20:00 compute-2 ceph-mon[76053]: [WRN] BLUESTORE_SLOW_OP_ALERT: 1 OSD(s) experiencing slow operations in BlueStore
Dec 01 10:20:00 compute-2 ceph-mon[76053]:      osd.2 observed slow operation indications in BlueStore
Dec 01 10:20:00 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3098762632' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:20:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:01 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2855581033' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 10:20:01 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2472683310' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 10:20:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:02.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:02 compute-2 podman[237689]: 2025-12-01 10:20:02.41031897 +0000 UTC m=+0.068561814 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 01 10:20:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:20:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:02.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:20:02 compute-2 ceph-mon[76053]: pgmap v882: 353 pgs: 353 active+clean; 88 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 01 10:20:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:03 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:20:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:20:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:04.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:20:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:04.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:20:04.707 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:20:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:20:04.708 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:20:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:20:04.708 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:20:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:04 compute-2 ceph-mon[76053]: pgmap v883: 353 pgs: 353 active+clean; 88 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 01 10:20:04 compute-2 sudo[237712]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:20:04 compute-2 sudo[237712]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:20:04 compute-2 sudo[237712]: pam_unix(sudo:session): session closed for user root
Dec 01 10:20:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:06.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:20:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:06.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:20:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:07 compute-2 ceph-mon[76053]: pgmap v884: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.5 MiB/s rd, 1.8 MiB/s wr, 83 op/s
Dec 01 10:20:07 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:20:07.127 141949 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 01 10:20:07 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:20:07.128 141949 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 01 10:20:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/3326335386' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:20:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/3326335386' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:20:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:20:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:08.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:20:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:08.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:08 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:20:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:09 compute-2 ceph-mon[76053]: pgmap v885: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Dec 01 10:20:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:20:10 compute-2 ceph-mon[76053]: pgmap v886: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Dec 01 10:20:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:10.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:10.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:12.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:20:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:12.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:20:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:13 compute-2 ceph-mon[76053]: pgmap v887: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Dec 01 10:20:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:13 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:20:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:14.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:14 compute-2 podman[237747]: 2025-12-01 10:20:14.445736892 +0000 UTC m=+0.094247726 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 01 10:20:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:14.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:15 compute-2 ceph-mon[76053]: pgmap v888: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Dec 01 10:20:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:16.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:16.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:17 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:20:17.130 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 01 10:20:17 compute-2 ceph-mon[76053]: pgmap v889: 353 pgs: 353 active+clean; 104 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.2 MiB/s wr, 86 op/s
Dec 01 10:20:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:18.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:18.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:18 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:20:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:19 compute-2 ceph-mon[76053]: pgmap v890: 353 pgs: 353 active+clean; 109 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 448 KiB/s rd, 1.8 MiB/s wr, 38 op/s
Dec 01 10:20:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:20.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:20.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:21 compute-2 ceph-mon[76053]: pgmap v891: 353 pgs: 353 active+clean; 109 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 30 KiB/s rd, 1.8 MiB/s wr, 19 op/s
Dec 01 10:20:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:22.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:22.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:23 compute-2 ceph-mon[76053]: pgmap v892: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 356 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Dec 01 10:20:23 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:20:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:20:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:24.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:20:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:24.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:24 compute-2 ceph-mon[76053]: pgmap v893: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 355 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 01 10:20:24 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:20:25 compute-2 sudo[237785]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:20:25 compute-2 sudo[237785]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:20:25 compute-2 sudo[237785]: pam_unix(sudo:session): session closed for user root
Dec 01 10:20:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:20:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:26.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:20:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:20:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:26.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:20:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:26 compute-2 ceph-mon[76053]: pgmap v894: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 356 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Dec 01 10:20:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:28 compute-2 podman[237814]: 2025-12-01 10:20:28.392476915 +0000 UTC m=+0.054292254 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 01 10:20:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:28.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:20:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:28.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:20:28 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:20:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:29 compute-2 ceph-mon[76053]: pgmap v895: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 338 KiB/s rd, 976 KiB/s wr, 53 op/s
Dec 01 10:20:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:30.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:20:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:30.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:20:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:31 compute-2 ceph-mon[76053]: pgmap v896: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 327 KiB/s rd, 398 KiB/s wr, 46 op/s
Dec 01 10:20:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:31 compute-2 sudo[237838]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:20:31 compute-2 sudo[237838]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:20:31 compute-2 sudo[237838]: pam_unix(sudo:session): session closed for user root
Dec 01 10:20:32 compute-2 sudo[237863]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:20:32 compute-2 sudo[237863]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:20:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:20:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:32.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:20:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:32.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:32 compute-2 sudo[237863]: pam_unix(sudo:session): session closed for user root
Dec 01 10:20:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:33 compute-2 ceph-mon[76053]: pgmap v897: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 327 KiB/s rd, 400 KiB/s wr, 46 op/s
Dec 01 10:20:33 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:20:33 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:20:33 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:20:33 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:20:33 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:20:33 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:20:33 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:20:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:33 compute-2 podman[237919]: 2025-12-01 10:20:33.412222485 +0000 UTC m=+0.066145883 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 01 10:20:33 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:20:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:34.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:34.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:35 compute-2 ceph-mon[76053]: pgmap v898: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 14 KiB/s wr, 1 op/s
Dec 01 10:20:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:20:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:36.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:20:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:36.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:37 compute-2 ceph-mon[76053]: pgmap v899: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 3.5 KiB/s rd, 14 KiB/s wr, 1 op/s
Dec 01 10:20:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:38.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:38.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:38 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:20:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:38 compute-2 sudo[237944]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:20:38 compute-2 sudo[237944]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:20:38 compute-2 sudo[237944]: pam_unix(sudo:session): session closed for user root
Dec 01 10:20:39 compute-2 ceph-mon[76053]: pgmap v900: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 3.1 KiB/s rd, 2.3 KiB/s wr, 1 op/s
Dec 01 10:20:39 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:20:39 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:20:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:40 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2403187449' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:20:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:20:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:40.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:40.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:41 compute-2 ceph-mon[76053]: pgmap v901: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 2.8 KiB/s rd, 2.0 KiB/s wr, 1 op/s
Dec 01 10:20:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:20:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:42.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:20:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:42.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:43 compute-2 ceph-mon[76053]: pgmap v902: 353 pgs: 353 active+clean; 167 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Dec 01 10:20:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:43 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:20:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:20:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:44.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:20:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 01 10:20:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:44.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 01 10:20:44 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2540559284' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 10:20:44 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/786623811' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 10:20:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:45 compute-2 sudo[237975]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:20:45 compute-2 sudo[237975]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:20:45 compute-2 sudo[237975]: pam_unix(sudo:session): session closed for user root
Dec 01 10:20:45 compute-2 podman[237999]: 2025-12-01 10:20:45.289686723 +0000 UTC m=+0.081336721 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller)
Dec 01 10:20:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:45 compute-2 ceph-mon[76053]: pgmap v903: 353 pgs: 353 active+clean; 167 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Dec 01 10:20:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:46 compute-2 nova_compute[230216]: 2025-12-01 10:20:46.412 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:20:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:46.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:46.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:46 compute-2 ceph-mon[76053]: pgmap v904: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 27 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Dec 01 10:20:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:48.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 01 10:20:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:48.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 01 10:20:48 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:20:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:48 compute-2 ceph-mon[76053]: pgmap v905: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Dec 01 10:20:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:50 compute-2 nova_compute[230216]: 2025-12-01 10:20:50.227 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:20:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 01 10:20:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:50.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 01 10:20:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:50.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:51 compute-2 ceph-mon[76053]: pgmap v906: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Dec 01 10:20:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:20:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:52.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:20:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 01 10:20:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:52.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 01 10:20:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:53 compute-2 ceph-mon[76053]: pgmap v907: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 102 op/s
Dec 01 10:20:53 compute-2 nova_compute[230216]: 2025-12-01 10:20:53.201 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:20:53 compute-2 nova_compute[230216]: 2025-12-01 10:20:53.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:20:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:53 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:20:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:54 compute-2 nova_compute[230216]: 2025-12-01 10:20:54.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:20:54 compute-2 nova_compute[230216]: 2025-12-01 10:20:54.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 10:20:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:54.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:54.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:55 compute-2 nova_compute[230216]: 2025-12-01 10:20:55.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:20:55 compute-2 nova_compute[230216]: 2025-12-01 10:20:55.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:20:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:55 compute-2 ceph-mon[76053]: pgmap v908: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 15 KiB/s wr, 75 op/s
Dec 01 10:20:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:20:55 compute-2 nova_compute[230216]: 2025-12-01 10:20:55.908 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:20:55 compute-2 nova_compute[230216]: 2025-12-01 10:20:55.909 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:20:55 compute-2 nova_compute[230216]: 2025-12-01 10:20:55.909 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:20:55 compute-2 nova_compute[230216]: 2025-12-01 10:20:55.909 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 10:20:55 compute-2 nova_compute[230216]: 2025-12-01 10:20:55.910 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:20:56 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:20:56 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1710096170' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:20:56 compute-2 nova_compute[230216]: 2025-12-01 10:20:56.332 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:20:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:56.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:56 compute-2 nova_compute[230216]: 2025-12-01 10:20:56.482 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 10:20:56 compute-2 nova_compute[230216]: 2025-12-01 10:20:56.483 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5240MB free_disk=59.92176818847656GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 10:20:56 compute-2 nova_compute[230216]: 2025-12-01 10:20:56.483 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:20:56 compute-2 nova_compute[230216]: 2025-12-01 10:20:56.484 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:20:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:56.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:56 compute-2 nova_compute[230216]: 2025-12-01 10:20:56.639 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 10:20:56 compute-2 nova_compute[230216]: 2025-12-01 10:20:56.639 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 10:20:56 compute-2 nova_compute[230216]: 2025-12-01 10:20:56.661 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:20:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:56 compute-2 ceph-mon[76053]: pgmap v909: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 17 KiB/s wr, 75 op/s
Dec 01 10:20:56 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1710096170' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:20:57 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:20:57 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2948582203' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:20:57 compute-2 nova_compute[230216]: 2025-12-01 10:20:57.109 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:20:57 compute-2 nova_compute[230216]: 2025-12-01 10:20:57.115 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 10:20:57 compute-2 nova_compute[230216]: 2025-12-01 10:20:57.136 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 10:20:57 compute-2 nova_compute[230216]: 2025-12-01 10:20:57.137 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 10:20:57 compute-2 nova_compute[230216]: 2025-12-01 10:20:57.138 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.654s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:20:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:57 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Dec 01 10:20:57 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Dec 01 10:20:57 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Dec 01 10:20:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:57 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2948582203' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:20:57 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1242197287' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:20:58 compute-2 nova_compute[230216]: 2025-12-01 10:20:58.138 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:20:58 compute-2 nova_compute[230216]: 2025-12-01 10:20:58.139 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 10:20:58 compute-2 nova_compute[230216]: 2025-12-01 10:20:58.139 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 10:20:58 compute-2 nova_compute[230216]: 2025-12-01 10:20:58.165 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 10:20:58 compute-2 nova_compute[230216]: 2025-12-01 10:20:58.165 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:20:58 compute-2 nova_compute[230216]: 2025-12-01 10:20:58.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:20:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:58.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:20:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:20:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:58.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:20:58 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:20:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:58 compute-2 ceph-mon[76053]: pgmap v910: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 4.3 KiB/s wr, 65 op/s
Dec 01 10:20:58 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3258920012' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:20:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:20:59 compute-2 podman[238085]: 2025-12-01 10:20:59.393134774 +0000 UTC m=+0.054179937 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 01 10:20:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:00.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:21:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:00.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:21:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:01 compute-2 ceph-mon[76053]: pgmap v911: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 4.3 KiB/s wr, 65 op/s
Dec 01 10:21:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:01 compute-2 anacron[141921]: Job `cron.daily' started
Dec 01 10:21:01 compute-2 anacron[141921]: Job `cron.daily' terminated
Dec 01 10:21:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:02.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:02 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/581613495' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:21:02 compute-2 ceph-mon[76053]: pgmap v912: 353 pgs: 353 active+clean; 200 MiB data, 352 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 2.1 MiB/s wr, 284 op/s
Dec 01 10:21:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:02.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:03 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:21:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:03 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2550727021' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:21:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:04 compute-2 podman[238113]: 2025-12-01 10:21:04.438171334 +0000 UTC m=+0.079477302 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 01 10:21:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:04.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:21:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:04.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:21:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:21:04.708 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:21:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:21:04.709 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:21:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:21:04.709 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:21:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:04 compute-2 ceph-mon[76053]: pgmap v913: 353 pgs: 353 active+clean; 200 MiB data, 352 MiB used, 60 GiB / 60 GiB avail; 419 KiB/s rd, 2.1 MiB/s wr, 219 op/s
Dec 01 10:21:05 compute-2 sudo[238134]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:21:05 compute-2 sudo[238134]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:21:05 compute-2 sudo[238134]: pam_unix(sudo:session): session closed for user root
Dec 01 10:21:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:06.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:21:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:06.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:21:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:07 compute-2 ceph-mon[76053]: pgmap v914: 353 pgs: 353 active+clean; 200 MiB data, 352 MiB used, 60 GiB / 60 GiB avail; 419 KiB/s rd, 2.1 MiB/s wr, 219 op/s
Dec 01 10:21:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/1514814098' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:21:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/1514814098' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:21:08 compute-2 sshd-session[238133]: Connection closed by 45.78.219.119 port 46974 [preauth]
Dec 01 10:21:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:21:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:08.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:21:08 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:21:08.597 141949 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 01 10:21:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:08.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:08 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:21:08.598 141949 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 01 10:21:08 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:21:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:09 compute-2 ceph-mon[76053]: pgmap v915: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 419 KiB/s rd, 2.2 MiB/s wr, 219 op/s
Dec 01 10:21:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:21:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:10.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:10.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:11 compute-2 ceph-mon[76053]: pgmap v916: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 419 KiB/s rd, 2.2 MiB/s wr, 219 op/s
Dec 01 10:21:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:11 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:21:11.601 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 01 10:21:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:12.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:12.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:13 compute-2 ceph-mon[76053]: pgmap v917: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 419 KiB/s rd, 2.2 MiB/s wr, 219 op/s
Dec 01 10:21:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:13 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:21:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 01 10:21:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:14.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 01 10:21:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 01 10:21:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:14.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 01 10:21:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:15 compute-2 ceph-mon[76053]: pgmap v918: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 13 KiB/s wr, 1 op/s
Dec 01 10:21:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:15 compute-2 podman[238171]: 2025-12-01 10:21:15.429376359 +0000 UTC m=+0.087796162 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.473964) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584475474095, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 1219, "num_deletes": 256, "total_data_size": 2766588, "memory_usage": 2807088, "flush_reason": "Manual Compaction"}
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584475501736, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 1819441, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28443, "largest_seqno": 29657, "table_properties": {"data_size": 1814239, "index_size": 2598, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11246, "raw_average_key_size": 19, "raw_value_size": 1803652, "raw_average_value_size": 3057, "num_data_blocks": 116, "num_entries": 590, "num_filter_entries": 590, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764584378, "oldest_key_time": 1764584378, "file_creation_time": 1764584475, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 27822 microseconds, and 4518 cpu microseconds.
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.501807) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 1819441 bytes OK
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.501854) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.503284) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.503297) EVENT_LOG_v1 {"time_micros": 1764584475503292, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.503317) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 2760727, prev total WAL file size 2760727, number of live WAL files 2.
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.504120) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00353033' seq:72057594037927935, type:22 .. '6C6F676D00373535' seq:0, type:0; will stop at (end)
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(1776KB)], [54(14MB)]
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584475504191, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 16964767, "oldest_snapshot_seqno": -1}
Dec 01 10:21:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 5944 keys, 16845725 bytes, temperature: kUnknown
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584475872334, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 16845725, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16802911, "index_size": 26832, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14917, "raw_key_size": 151846, "raw_average_key_size": 25, "raw_value_size": 16692453, "raw_average_value_size": 2808, "num_data_blocks": 1098, "num_entries": 5944, "num_filter_entries": 5944, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764584475, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.872657) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 16845725 bytes
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.874651) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 46.1 rd, 45.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 14.4 +0.0 blob) out(16.1 +0.0 blob), read-write-amplify(18.6) write-amplify(9.3) OK, records in: 6470, records dropped: 526 output_compression: NoCompression
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.874690) EVENT_LOG_v1 {"time_micros": 1764584475874668, "job": 32, "event": "compaction_finished", "compaction_time_micros": 368229, "compaction_time_cpu_micros": 36147, "output_level": 6, "num_output_files": 1, "total_output_size": 16845725, "num_input_records": 6470, "num_output_records": 5944, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584475875156, "job": 32, "event": "table_file_deletion", "file_number": 56}
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584475878579, "job": 32, "event": "table_file_deletion", "file_number": 54}
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.504031) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.878668) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.878674) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.878676) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.878678) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:21:15 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.878680) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:21:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:16.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:16 compute-2 ceph-mon[76053]: pgmap v919: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 16 KiB/s wr, 1 op/s
Dec 01 10:21:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:16.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:21:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:18.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:21:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:18.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:18 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:21:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:18 compute-2 ceph-mon[76053]: pgmap v920: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 14 KiB/s wr, 1 op/s
Dec 01 10:21:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:21:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:20.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:21:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:20.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:20 compute-2 ceph-mon[76053]: pgmap v921: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 3.3 KiB/s wr, 0 op/s
Dec 01 10:21:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:22.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:21:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:22.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:21:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:23 compute-2 ceph-mon[76053]: pgmap v922: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 7.7 KiB/s wr, 1 op/s
Dec 01 10:21:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:23 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:21:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 01 10:21:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:24.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 01 10:21:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:24.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:25 compute-2 ceph-mon[76053]: pgmap v923: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 510 B/s rd, 7.6 KiB/s wr, 1 op/s
Dec 01 10:21:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:21:25 compute-2 sudo[238208]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:21:25 compute-2 sudo[238208]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:21:25 compute-2 sudo[238208]: pam_unix(sudo:session): session closed for user root
Dec 01 10:21:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:26 compute-2 ceph-mon[76053]: pgmap v924: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 766 B/s rd, 9.0 KiB/s wr, 2 op/s
Dec 01 10:21:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:26.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:26.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:28.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 01 10:21:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:28.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 01 10:21:28 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:21:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:29 compute-2 ceph-mon[76053]: pgmap v925: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 766 B/s rd, 6.3 KiB/s wr, 1 op/s
Dec 01 10:21:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:30 compute-2 podman[238237]: 2025-12-01 10:21:30.416470593 +0000 UTC m=+0.066755778 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 01 10:21:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:30.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:21:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:30.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:21:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:31 compute-2 ceph-mon[76053]: pgmap v926: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 510 B/s rd, 5.7 KiB/s wr, 1 op/s
Dec 01 10:21:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:32.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:32.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:33 compute-2 ceph-mon[76053]: pgmap v927: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 766 B/s rd, 9.0 KiB/s wr, 2 op/s
Dec 01 10:21:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:33 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:21:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:34.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:34.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:35 compute-2 ceph-mon[76053]: pgmap v928: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 510 B/s rd, 4.7 KiB/s wr, 1 op/s
Dec 01 10:21:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:35 compute-2 podman[238261]: 2025-12-01 10:21:35.401206996 +0000 UTC m=+0.056480667 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 01 10:21:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:36.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:36.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:37 compute-2 ceph-mon[76053]: pgmap v929: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 7.3 KiB/s wr, 1 op/s
Dec 01 10:21:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:38.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:38.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:38 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:21:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:39 compute-2 sudo[238284]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:21:39 compute-2 sudo[238284]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:21:39 compute-2 sudo[238284]: pam_unix(sudo:session): session closed for user root
Dec 01 10:21:39 compute-2 sudo[238309]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:21:39 compute-2 sudo[238309]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:21:39 compute-2 ceph-mon[76053]: pgmap v930: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 6.0 KiB/s wr, 1 op/s
Dec 01 10:21:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:39 compute-2 sudo[238309]: pam_unix(sudo:session): session closed for user root
Dec 01 10:21:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:21:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:21:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:21:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:21:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:21:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:21:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:21:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:21:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:40.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:40.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:41 compute-2 ceph-mon[76053]: pgmap v931: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 6.0 KiB/s wr, 1 op/s
Dec 01 10:21:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:42.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:42.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:43 compute-2 ceph-mon[76053]: pgmap v932: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 8.0 KiB/s wr, 2 op/s
Dec 01 10:21:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:43 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:21:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:44 compute-2 ceph-mon[76053]: pgmap v933: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 4.7 KiB/s wr, 1 op/s
Dec 01 10:21:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:44.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 01 10:21:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:44.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 01 10:21:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:45 compute-2 sudo[238374]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:21:45 compute-2 sudo[238374]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:21:45 compute-2 sudo[238374]: pam_unix(sudo:session): session closed for user root
Dec 01 10:21:45 compute-2 podman[238398]: 2025-12-01 10:21:45.64536063 +0000 UTC m=+0.090265447 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 01 10:21:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:45 compute-2 sudo[238425]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:21:45 compute-2 sudo[238425]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:21:45 compute-2 sudo[238425]: pam_unix(sudo:session): session closed for user root
Dec 01 10:21:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:46.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:46.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:46 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:21:46 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:21:46 compute-2 ceph-mon[76053]: pgmap v934: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 14 KiB/s wr, 2 op/s
Dec 01 10:21:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:48 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:21:48.304 141949 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 01 10:21:48 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:21:48.307 141949 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 01 10:21:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:48.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:48.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:48 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:21:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:49 compute-2 ceph-mon[76053]: pgmap v935: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 11 KiB/s wr, 2 op/s
Dec 01 10:21:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:50 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1601500382' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:21:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 01 10:21:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:50.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 01 10:21:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:50.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:51 compute-2 nova_compute[230216]: 2025-12-01 10:21:51.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:21:51 compute-2 ceph-mon[76053]: pgmap v936: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 11 KiB/s wr, 2 op/s
Dec 01 10:21:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:21:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:52.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:21:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 01 10:21:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:52.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 01 10:21:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:53 compute-2 nova_compute[230216]: 2025-12-01 10:21:53.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:21:53 compute-2 ceph-mon[76053]: pgmap v937: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 15 KiB/s wr, 30 op/s
Dec 01 10:21:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:53 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:21:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:54.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:54.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:55 compute-2 nova_compute[230216]: 2025-12-01 10:21:55.201 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:21:55 compute-2 nova_compute[230216]: 2025-12-01 10:21:55.202 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:21:55 compute-2 ceph-mon[76053]: pgmap v938: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 13 KiB/s wr, 29 op/s
Dec 01 10:21:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:21:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:56 compute-2 ceph-mon[76053]: pgmap v939: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 14 KiB/s wr, 30 op/s
Dec 01 10:21:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:56.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:56.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:57 compute-2 nova_compute[230216]: 2025-12-01 10:21:57.091 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:21:57 compute-2 nova_compute[230216]: 2025-12-01 10:21:57.092 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:21:57 compute-2 nova_compute[230216]: 2025-12-01 10:21:57.092 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 10:21:57 compute-2 nova_compute[230216]: 2025-12-01 10:21:57.092 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:21:57 compute-2 nova_compute[230216]: 2025-12-01 10:21:57.135 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:21:57 compute-2 nova_compute[230216]: 2025-12-01 10:21:57.135 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:21:57 compute-2 nova_compute[230216]: 2025-12-01 10:21:57.136 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:21:57 compute-2 nova_compute[230216]: 2025-12-01 10:21:57.136 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 10:21:57 compute-2 nova_compute[230216]: 2025-12-01 10:21:57.137 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:21:57 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:21:57.309 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 01 10:21:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:57 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:21:57 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3382881795' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:21:57 compute-2 nova_compute[230216]: 2025-12-01 10:21:57.596 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:21:57 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3382881795' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:21:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:57 compute-2 nova_compute[230216]: 2025-12-01 10:21:57.762 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 10:21:57 compute-2 nova_compute[230216]: 2025-12-01 10:21:57.764 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5228MB free_disk=59.94247817993164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 10:21:57 compute-2 nova_compute[230216]: 2025-12-01 10:21:57.764 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:21:57 compute-2 nova_compute[230216]: 2025-12-01 10:21:57.764 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:21:57 compute-2 nova_compute[230216]: 2025-12-01 10:21:57.824 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 10:21:57 compute-2 nova_compute[230216]: 2025-12-01 10:21:57.825 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 10:21:57 compute-2 nova_compute[230216]: 2025-12-01 10:21:57.845 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:21:58 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:21:58 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/16920269' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:21:58 compute-2 nova_compute[230216]: 2025-12-01 10:21:58.272 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:21:58 compute-2 nova_compute[230216]: 2025-12-01 10:21:58.280 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 10:21:58 compute-2 nova_compute[230216]: 2025-12-01 10:21:58.327 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 10:21:58 compute-2 nova_compute[230216]: 2025-12-01 10:21:58.329 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 10:21:58 compute-2 nova_compute[230216]: 2025-12-01 10:21:58.330 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.565s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:21:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:58 compute-2 nova_compute[230216]: 2025-12-01 10:21:58.446 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:21:58 compute-2 nova_compute[230216]: 2025-12-01 10:21:58.446 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 10:21:58 compute-2 nova_compute[230216]: 2025-12-01 10:21:58.447 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 10:21:58 compute-2 nova_compute[230216]: 2025-12-01 10:21:58.462 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 10:21:58 compute-2 nova_compute[230216]: 2025-12-01 10:21:58.462 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:21:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 01 10:21:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:58.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 01 10:21:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:21:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:21:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:58.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:21:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:59 compute-2 nova_compute[230216]: 2025-12-01 10:21:59.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:21:59 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:21:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:21:59 compute-2 ceph-mon[76053]: pgmap v940: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 4.5 KiB/s wr, 28 op/s
Dec 01 10:21:59 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3813778074' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:21:59 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/16920269' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:21:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:00 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1203186481' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:22:00 compute-2 ceph-mon[76053]: pgmap v941: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 4.5 KiB/s wr, 28 op/s
Dec 01 10:22:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:22:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:00.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:22:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:22:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:00.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:22:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:01 compute-2 podman[238509]: 2025-12-01 10:22:01.393564094 +0000 UTC m=+0.051373634 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 01 10:22:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:01 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/596614569' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:22:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:02.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 01 10:22:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:02.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 01 10:22:02 compute-2 ceph-mon[76053]: pgmap v942: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 38 KiB/s rd, 5.7 KiB/s wr, 56 op/s
Dec 01 10:22:02 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/427530599' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:22:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:04 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:22:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:04.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:04.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:22:04.710 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:22:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:22:04.710 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:22:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:22:04.710 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:22:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:05 compute-2 ceph-mon[76053]: pgmap v943: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 2.2 KiB/s wr, 28 op/s
Dec 01 10:22:05 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/421893025' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:22:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:05 compute-2 sudo[238533]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:22:05 compute-2 sudo[238533]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:22:05 compute-2 sudo[238533]: pam_unix(sudo:session): session closed for user root
Dec 01 10:22:05 compute-2 podman[238557]: 2025-12-01 10:22:05.726916781 +0000 UTC m=+0.053365071 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Dec 01 10:22:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:06.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:06 compute-2 ceph-mon[76053]: pgmap v944: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 2.2 KiB/s wr, 28 op/s
Dec 01 10:22:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:06.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/2051891042' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:22:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/2051891042' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:22:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:08.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:08 compute-2 ceph-mon[76053]: pgmap v945: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Dec 01 10:22:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:22:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:08.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:22:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:09 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:22:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:09 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:22:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:10.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:10.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:10 compute-2 ceph-mon[76053]: pgmap v946: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Dec 01 10:22:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:12.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:12.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:13 compute-2 ceph-mon[76053]: pgmap v947: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Dec 01 10:22:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:14 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:22:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:14.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:14.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:15 compute-2 ceph-mon[76053]: pgmap v948: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:22:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:16 compute-2 podman[238588]: 2025-12-01 10:22:16.430005509 +0000 UTC m=+0.089330622 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 10:22:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:22:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:16.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:22:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:16.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:17 compute-2 ceph-mon[76053]: pgmap v949: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:22:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:18 compute-2 ceph-mon[76053]: pgmap v950: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:22:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:22:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:18.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:22:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:18.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:19 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:22:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:20.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:22:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:20.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:22:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:21 compute-2 ceph-mon[76053]: pgmap v951: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:22:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.003000071s ======
Dec 01 10:22:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:22.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000071s
Dec 01 10:22:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:22.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:23 compute-2 ceph-mon[76053]: pgmap v952: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:22:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:24 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:22:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:24 compute-2 ceph-mon[76053]: pgmap v953: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:22:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:24.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:24.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:22:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:25 compute-2 sudo[238625]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:22:25 compute-2 sudo[238625]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:22:25 compute-2 sudo[238625]: pam_unix(sudo:session): session closed for user root
Dec 01 10:22:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:26 compute-2 ceph-mon[76053]: pgmap v954: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:22:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:22:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:26.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:22:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:26.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:27 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/238511788' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:22:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:28.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:28.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:28 compute-2 ceph-mon[76053]: pgmap v955: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:22:29 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:22:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:30.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:30.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:31 compute-2 ceph-mon[76053]: pgmap v956: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:22:31 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2667232419' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 10:22:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:32 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/761622790' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 10:22:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:32 compute-2 podman[238656]: 2025-12-01 10:22:32.386484351 +0000 UTC m=+0.048982883 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 01 10:22:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:32.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:32.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:33 compute-2 ceph-mon[76053]: pgmap v957: 353 pgs: 353 active+clean; 88 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 01 10:22:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:34 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:22:34 compute-2 ceph-mon[76053]: pgmap v958: 353 pgs: 353 active+clean; 88 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 01 10:22:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:22:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:34.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:22:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:22:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:34.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:22:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:36 compute-2 podman[238680]: 2025-12-01 10:22:36.419475748 +0000 UTC m=+0.072659984 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 01 10:22:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:36.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:36.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:37 compute-2 ceph-mon[76053]: pgmap v959: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 878 KiB/s rd, 1.8 MiB/s wr, 66 op/s
Dec 01 10:22:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:38.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:22:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:38.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:22:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:39 compute-2 ceph-mon[76053]: pgmap v960: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 878 KiB/s rd, 1.8 MiB/s wr, 65 op/s
Dec 01 10:22:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:39 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:22:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:22:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:22:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:40.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:22:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:40.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:41 compute-2 ceph-mon[76053]: pgmap v961: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 878 KiB/s rd, 1.8 MiB/s wr, 65 op/s
Dec 01 10:22:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:42.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:22:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:42.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:22:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:43 compute-2 ceph-mon[76053]: pgmap v962: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Dec 01 10:22:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:44 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1338427231' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:22:44 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:22:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:44.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:22:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:44.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:22:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:45 compute-2 ceph-mon[76053]: pgmap v963: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Dec 01 10:22:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:45 compute-2 sudo[238713]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:22:45 compute-2 sudo[238713]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:22:45 compute-2 sudo[238713]: pam_unix(sudo:session): session closed for user root
Dec 01 10:22:46 compute-2 sudo[238738]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:22:46 compute-2 sudo[238738]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:22:46 compute-2 sudo[238738]: pam_unix(sudo:session): session closed for user root
Dec 01 10:22:46 compute-2 sudo[238763]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:22:46 compute-2 sudo[238763]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:22:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:46 compute-2 ceph-mon[76053]: pgmap v964: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 100 op/s
Dec 01 10:22:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:22:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:46.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:22:46 compute-2 sudo[238763]: pam_unix(sudo:session): session closed for user root
Dec 01 10:22:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:46.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:46 compute-2 sudo[238820]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:22:46 compute-2 sudo[238820]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:22:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:46 compute-2 sudo[238820]: pam_unix(sudo:session): session closed for user root
Dec 01 10:22:46 compute-2 sudo[238851]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Dec 01 10:22:46 compute-2 sudo[238851]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:22:46 compute-2 podman[238844]: 2025-12-01 10:22:46.894563932 +0000 UTC m=+0.089906490 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 01 10:22:47 compute-2 sudo[238851]: pam_unix(sudo:session): session closed for user root
Dec 01 10:22:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:48 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:22:48 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:22:48 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:22:48 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:22:48 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:22:48 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:22:48 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:22:48 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:22:48 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:22:48 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:22:48 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:22:48 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:22:48 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:22:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:22:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:48.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:22:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:48.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:49 compute-2 ceph-mon[76053]: pgmap v965: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.1 MiB/s rd, 1.2 KiB/s wr, 62 op/s
Dec 01 10:22:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:49 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:22:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:50 compute-2 ceph-mon[76053]: pgmap v966: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.1 MiB/s rd, 1.2 KiB/s wr, 62 op/s
Dec 01 10:22:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:50.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:22:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:50.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:22:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:52 compute-2 sudo[238921]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:22:52 compute-2 sudo[238921]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:22:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:52.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:52 compute-2 sudo[238921]: pam_unix(sudo:session): session closed for user root
Dec 01 10:22:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:22:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:52.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:22:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:53 compute-2 nova_compute[230216]: 2025-12-01 10:22:53.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:22:53 compute-2 ceph-mon[76053]: pgmap v967: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.1 MiB/s rd, 1.2 KiB/s wr, 62 op/s
Dec 01 10:22:53 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:22:53 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:22:53 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/4254081221' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:22:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:54 compute-2 nova_compute[230216]: 2025-12-01 10:22:54.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:22:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:54 compute-2 ceph-mon[76053]: pgmap v968: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 26 op/s
Dec 01 10:22:54 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:22:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:54.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:54.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:22:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:56 compute-2 nova_compute[230216]: 2025-12-01 10:22:56.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:22:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:56 compute-2 ceph-mon[76053]: pgmap v969: 353 pgs: 353 active+clean; 88 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 37 KiB/s rd, 1.8 MiB/s wr, 54 op/s
Dec 01 10:22:56 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2132823220' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 10:22:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:56.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:56.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:57 compute-2 nova_compute[230216]: 2025-12-01 10:22:57.201 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:22:57 compute-2 nova_compute[230216]: 2025-12-01 10:22:57.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:22:57 compute-2 nova_compute[230216]: 2025-12-01 10:22:57.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 10:22:57 compute-2 nova_compute[230216]: 2025-12-01 10:22:57.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 10:22:57 compute-2 nova_compute[230216]: 2025-12-01 10:22:57.220 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 10:22:57 compute-2 nova_compute[230216]: 2025-12-01 10:22:57.221 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:22:57 compute-2 nova_compute[230216]: 2025-12-01 10:22:57.221 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 10:22:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:57 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/212026365' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 10:22:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:58 compute-2 nova_compute[230216]: 2025-12-01 10:22:58.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:22:58 compute-2 nova_compute[230216]: 2025-12-01 10:22:58.228 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:22:58 compute-2 nova_compute[230216]: 2025-12-01 10:22:58.228 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:22:58 compute-2 nova_compute[230216]: 2025-12-01 10:22:58.228 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:22:58 compute-2 nova_compute[230216]: 2025-12-01 10:22:58.229 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 10:22:58 compute-2 nova_compute[230216]: 2025-12-01 10:22:58.229 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:22:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:22:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:58.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:22:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:22:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:22:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:58.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:22:58 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:22:58 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2629360791' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:22:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:58 compute-2 nova_compute[230216]: 2025-12-01 10:22:58.806 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.577s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:22:58 compute-2 ceph-mon[76053]: pgmap v970: 353 pgs: 353 active+clean; 88 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 01 10:22:58 compute-2 nova_compute[230216]: 2025-12-01 10:22:58.985 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 10:22:58 compute-2 nova_compute[230216]: 2025-12-01 10:22:58.986 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5220MB free_disk=59.967525482177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 10:22:58 compute-2 nova_compute[230216]: 2025-12-01 10:22:58.986 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:22:58 compute-2 nova_compute[230216]: 2025-12-01 10:22:58.987 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:22:59 compute-2 nova_compute[230216]: 2025-12-01 10:22:59.062 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 10:22:59 compute-2 nova_compute[230216]: 2025-12-01 10:22:59.063 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 10:22:59 compute-2 nova_compute[230216]: 2025-12-01 10:22:59.094 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:22:59 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:22:59.282 141949 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 01 10:22:59 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:22:59.283 141949 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 01 10:22:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:22:59 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:22:59 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4240160464' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:22:59 compute-2 nova_compute[230216]: 2025-12-01 10:22:59.528 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:22:59 compute-2 nova_compute[230216]: 2025-12-01 10:22:59.533 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 10:22:59 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:22:59 compute-2 nova_compute[230216]: 2025-12-01 10:22:59.552 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 10:22:59 compute-2 nova_compute[230216]: 2025-12-01 10:22:59.555 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 10:22:59 compute-2 nova_compute[230216]: 2025-12-01 10:22:59.556 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.569s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:22:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:00 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2629360791' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:23:00 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/4240160464' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:23:00 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1200241194' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.285927) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584580286072, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 1640, "num_deletes": 502, "total_data_size": 3418147, "memory_usage": 3472608, "flush_reason": "Manual Compaction"}
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584580338024, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 2094203, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 29663, "largest_seqno": 31297, "table_properties": {"data_size": 2087844, "index_size": 3049, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2245, "raw_key_size": 17625, "raw_average_key_size": 19, "raw_value_size": 2072866, "raw_average_value_size": 2334, "num_data_blocks": 132, "num_entries": 888, "num_filter_entries": 888, "num_deletions": 502, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764584475, "oldest_key_time": 1764584475, "file_creation_time": 1764584580, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 52170 microseconds, and 6343 cpu microseconds.
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:23:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.338116) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 2094203 bytes OK
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.338142) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.378430) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.378507) EVENT_LOG_v1 {"time_micros": 1764584580378491, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.378541) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 3409621, prev total WAL file size 3409621, number of live WAL files 2.
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.379543) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032323539' seq:72057594037927935, type:22 .. '7061786F730032353131' seq:0, type:0; will stop at (end)
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(2045KB)], [57(16MB)]
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584580379640, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 18939928, "oldest_snapshot_seqno": -1}
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 5816 keys, 12756481 bytes, temperature: kUnknown
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584580498148, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 12756481, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12719362, "index_size": 21457, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14597, "raw_key_size": 150302, "raw_average_key_size": 25, "raw_value_size": 12616027, "raw_average_value_size": 2169, "num_data_blocks": 859, "num_entries": 5816, "num_filter_entries": 5816, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764584580, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.498461) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 12756481 bytes
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.510266) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 159.7 rd, 107.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 16.1 +0.0 blob) out(12.2 +0.0 blob), read-write-amplify(15.1) write-amplify(6.1) OK, records in: 6832, records dropped: 1016 output_compression: NoCompression
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.510324) EVENT_LOG_v1 {"time_micros": 1764584580510303, "job": 34, "event": "compaction_finished", "compaction_time_micros": 118608, "compaction_time_cpu_micros": 27392, "output_level": 6, "num_output_files": 1, "total_output_size": 12756481, "num_input_records": 6832, "num_output_records": 5816, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584580511277, "job": 34, "event": "table_file_deletion", "file_number": 59}
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584580516000, "job": 34, "event": "table_file_deletion", "file_number": 57}
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.379444) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.516070) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.516075) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.516077) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.516079) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:23:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.516081) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:23:00 compute-2 nova_compute[230216]: 2025-12-01 10:23:00.556 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:23:00 compute-2 nova_compute[230216]: 2025-12-01 10:23:00.557 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:23:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:00.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:00.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:01 compute-2 ceph-mon[76053]: pgmap v971: 353 pgs: 353 active+clean; 88 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 01 10:23:01 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1107339476' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:23:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:02 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3547558712' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:23:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:02.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:23:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:02.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:23:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:03 compute-2 podman[239001]: 2025-12-01 10:23:03.386569132 +0000 UTC m=+0.047502988 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 01 10:23:03 compute-2 ceph-mon[76053]: pgmap v972: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Dec 01 10:23:03 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2044304106' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:23:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:04 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:23:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:23:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:04.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:23:04 compute-2 ceph-mon[76053]: pgmap v973: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Dec 01 10:23:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:23:04.710 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:23:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:23:04.711 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:23:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:23:04.711 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:23:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:23:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:04.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:23:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:05 compute-2 sudo[239024]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:23:05 compute-2 sudo[239024]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:23:05 compute-2 sudo[239024]: pam_unix(sudo:session): session closed for user root
Dec 01 10:23:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:23:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:06.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:23:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:06.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:07 compute-2 podman[239050]: 2025-12-01 10:23:07.394920978 +0000 UTC m=+0.049843085 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Dec 01 10:23:07 compute-2 ceph-mon[76053]: pgmap v974: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 107 op/s
Dec 01 10:23:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/503124220' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:23:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/503124220' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:23:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:08 compute-2 ceph-mon[76053]: pgmap v975: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 80 op/s
Dec 01 10:23:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:08.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:08.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:09 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:23:09.284 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 01 10:23:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:09 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:23:09 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:23:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:10.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:10.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:10 compute-2 ceph-mon[76053]: pgmap v976: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 80 op/s
Dec 01 10:23:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:11 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3760003608' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:23:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:12.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:12.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:12 compute-2 ceph-mon[76053]: pgmap v977: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 100 op/s
Dec 01 10:23:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:14 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:23:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:14.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:14.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:15 compute-2 ceph-mon[76053]: pgmap v978: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.2 KiB/s wr, 90 op/s
Dec 01 10:23:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:23:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:16.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:23:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:16.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:17 compute-2 ceph-mon[76053]: pgmap v979: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.2 KiB/s wr, 90 op/s
Dec 01 10:23:17 compute-2 podman[239080]: 2025-12-01 10:23:17.425406451 +0000 UTC m=+0.084845866 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 01 10:23:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:23:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:18.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:23:18 compute-2 ceph-mon[76053]: pgmap v980: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 14 KiB/s rd, 1.1 KiB/s wr, 20 op/s
Dec 01 10:23:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:18.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:19 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:23:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:20 compute-2 ceph-mon[76053]: pgmap v981: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 14 KiB/s rd, 1.1 KiB/s wr, 20 op/s
Dec 01 10:23:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:23:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:20.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:23:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:20.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:22.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:23:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:22.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:23:23 compute-2 ceph-mon[76053]: pgmap v982: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 14 KiB/s rd, 1.1 KiB/s wr, 21 op/s
Dec 01 10:23:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:24 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:23:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:24.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:23:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:24.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:23:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:25 compute-2 ceph-mon[76053]: pgmap v983: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:23:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:23:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:26 compute-2 sudo[239117]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:23:26 compute-2 sudo[239117]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:23:26 compute-2 sudo[239117]: pam_unix(sudo:session): session closed for user root
Dec 01 10:23:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:26 compute-2 ceph-mon[76053]: pgmap v984: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:23:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:23:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:26.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:23:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:23:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:26.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:23:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:28.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:28.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:29 compute-2 ceph-mon[76053]: pgmap v985: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:23:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:29 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:23:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:30.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:30.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:31 compute-2 ceph-mon[76053]: pgmap v986: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:23:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:23:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:32.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:23:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:32.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:33 compute-2 ceph-mon[76053]: pgmap v987: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:23:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:34 compute-2 podman[239150]: 2025-12-01 10:23:34.387440854 +0000 UTC m=+0.048488313 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 01 10:23:34 compute-2 ceph-mon[76053]: pgmap v988: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:23:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:23:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:34.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:23:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:34.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:34 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:23:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:36.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:36.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:37 compute-2 ceph-mon[76053]: pgmap v989: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:23:37 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2076139831' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:23:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:38 compute-2 podman[239175]: 2025-12-01 10:23:38.403975841 +0000 UTC m=+0.060419905 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 10:23:38 compute-2 ceph-mon[76053]: pgmap v990: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:23:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:38.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:38.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:39 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:23:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:39 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:23:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:40.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:40 compute-2 ceph-mon[76053]: pgmap v991: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:23:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:23:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:40.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:23:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:41 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2614357069' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 10:23:41 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/649594026' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 10:23:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:23:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:42.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:23:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:23:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:42.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:23:43 compute-2 ceph-mon[76053]: pgmap v992: 353 pgs: 353 active+clean; 88 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 01 10:23:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:44.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:44.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:44 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:23:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:45 compute-2 ceph-mon[76053]: pgmap v993: 353 pgs: 353 active+clean; 88 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 01 10:23:45 compute-2 sshd-session[239173]: Connection closed by 45.78.219.119 port 44820 [preauth]
Dec 01 10:23:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:46 compute-2 sudo[239203]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:23:46 compute-2 sudo[239203]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:23:46 compute-2 sudo[239203]: pam_unix(sudo:session): session closed for user root
Dec 01 10:23:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:46 compute-2 ceph-mon[76053]: pgmap v994: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.8 MiB/s rd, 1.8 MiB/s wr, 97 op/s
Dec 01 10:23:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:46.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:23:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:46.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:23:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:48 compute-2 podman[239230]: 2025-12-01 10:23:48.427703668 +0000 UTC m=+0.086651990 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Dec 01 10:23:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:48.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:48.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:49 compute-2 ceph-mon[76053]: pgmap v995: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.8 MiB/s rd, 1.8 MiB/s wr, 96 op/s
Dec 01 10:23:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:49 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:23:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:50.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:50.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:51 compute-2 ceph-mon[76053]: pgmap v996: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.8 MiB/s rd, 1.8 MiB/s wr, 96 op/s
Dec 01 10:23:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:23:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:52.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:23:52 compute-2 sudo[239260]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:23:52 compute-2 sudo[239260]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:23:52 compute-2 sudo[239260]: pam_unix(sudo:session): session closed for user root
Dec 01 10:23:52 compute-2 sudo[239285]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Dec 01 10:23:52 compute-2 sudo[239285]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:23:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:52.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:53 compute-2 nova_compute[230216]: 2025-12-01 10:23:53.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:23:53 compute-2 ceph-mon[76053]: pgmap v997: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Dec 01 10:23:53 compute-2 podman[239383]: 2025-12-01 10:23:53.342526379 +0000 UTC m=+0.073420315 container exec 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 01 10:23:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:53 compute-2 podman[239383]: 2025-12-01 10:23:53.46109929 +0000 UTC m=+0.191993226 container exec_died 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, CEPH_REF=squid, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 01 10:23:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:53 compute-2 podman[239501]: 2025-12-01 10:23:53.984051365 +0000 UTC m=+0.067855028 container exec f15aa877aa74ea87a56ed7c269b1149a449e9df36b30b3012f6ca84c92ef9519 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 01 10:23:54 compute-2 podman[239501]: 2025-12-01 10:23:54.023009332 +0000 UTC m=+0.106812975 container exec_died f15aa877aa74ea87a56ed7c269b1149a449e9df36b30b3012f6ca84c92ef9519 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 01 10:23:54 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec 01 10:23:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:54 compute-2 podman[239640]: 2025-12-01 10:23:54.526087457 +0000 UTC m=+0.052594893 container exec 5c83b48e4050b43a9798275fb3960cbbe72d63867925b25374b9306e495d3a3c (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt)
Dec 01 10:23:54 compute-2 podman[239640]: 2025-12-01 10:23:54.536999335 +0000 UTC m=+0.063506771 container exec_died 5c83b48e4050b43a9798275fb3960cbbe72d63867925b25374b9306e495d3a3c (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt)
Dec 01 10:23:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:23:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:54.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:23:54 compute-2 podman[239706]: 2025-12-01 10:23:54.7403868 +0000 UTC m=+0.050033740 container exec a0ef4b89ca18a14932dc44e4b2a521390b6fd555df2aaa06cada89bf83a23464 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv, io.openshift.tags=Ceph keepalived, vcs-type=git, version=2.2.4, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., description=keepalived for Ceph, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, name=keepalived, release=1793, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 01 10:23:54 compute-2 podman[239706]: 2025-12-01 10:23:54.753994055 +0000 UTC m=+0.063640965 container exec_died a0ef4b89ca18a14932dc44e4b2a521390b6fd555df2aaa06cada89bf83a23464 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, name=keepalived, com.redhat.component=keepalived-container, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1793, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Dec 01 10:23:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:54.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:54 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:23:54 compute-2 sudo[239285]: pam_unix(sudo:session): session closed for user root
Dec 01 10:23:55 compute-2 nova_compute[230216]: 2025-12-01 10:23:55.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:23:55 compute-2 sudo[239776]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:23:55 compute-2 sudo[239776]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:23:55 compute-2 sudo[239776]: pam_unix(sudo:session): session closed for user root
Dec 01 10:23:55 compute-2 sudo[239801]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:23:55 compute-2 sudo[239801]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:23:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:55 compute-2 ceph-mon[76053]: pgmap v998: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Dec 01 10:23:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:23:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:23:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:23:55 compute-2 sudo[239801]: pam_unix(sudo:session): session closed for user root
Dec 01 10:23:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:56 compute-2 nova_compute[230216]: 2025-12-01 10:23:56.202 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:23:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:56.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:23:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:56.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:23:56 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:23:56 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:23:56 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Dec 01 10:23:56 compute-2 ceph-mon[76053]: pgmap v999: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 76 op/s
Dec 01 10:23:56 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Dec 01 10:23:56 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:23:56 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:23:57 compute-2 nova_compute[230216]: 2025-12-01 10:23:57.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:23:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:58 compute-2 ceph-mon[76053]: pgmap v1000: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 168 KiB/s rd, 7 op/s
Dec 01 10:23:58 compute-2 ceph-mon[76053]: Health check failed: 2 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Dec 01 10:23:58 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:23:58 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:23:58 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:23:58 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:23:58 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:23:58 compute-2 nova_compute[230216]: 2025-12-01 10:23:58.201 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:23:58 compute-2 nova_compute[230216]: 2025-12-01 10:23:58.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:23:58 compute-2 nova_compute[230216]: 2025-12-01 10:23:58.206 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 10:23:58 compute-2 nova_compute[230216]: 2025-12-01 10:23:58.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 10:23:58 compute-2 nova_compute[230216]: 2025-12-01 10:23:58.232 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 10:23:58 compute-2 nova_compute[230216]: 2025-12-01 10:23:58.232 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:23:58 compute-2 nova_compute[230216]: 2025-12-01 10:23:58.232 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 10:23:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.003000072s ======
Dec 01 10:23:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:58.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000072s
Dec 01 10:23:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:23:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:23:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:58.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:23:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:23:59 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:24:00 compute-2 ceph-mon[76053]: pgmap v1001: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 168 KiB/s rd, 7 op/s
Dec 01 10:24:00 compute-2 nova_compute[230216]: 2025-12-01 10:24:00.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:24:00 compute-2 nova_compute[230216]: 2025-12-01 10:24:00.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:24:00 compute-2 nova_compute[230216]: 2025-12-01 10:24:00.277 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:24:00 compute-2 nova_compute[230216]: 2025-12-01 10:24:00.278 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:24:00 compute-2 nova_compute[230216]: 2025-12-01 10:24:00.278 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:24:00 compute-2 nova_compute[230216]: 2025-12-01 10:24:00.278 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 10:24:00 compute-2 nova_compute[230216]: 2025-12-01 10:24:00.278 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:24:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:24:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:00.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:24:00 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:24:00 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/857535630' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:24:00 compute-2 nova_compute[230216]: 2025-12-01 10:24:00.718 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:24:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:00.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:00 compute-2 nova_compute[230216]: 2025-12-01 10:24:00.916 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 10:24:00 compute-2 nova_compute[230216]: 2025-12-01 10:24:00.917 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5213MB free_disk=59.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 10:24:00 compute-2 nova_compute[230216]: 2025-12-01 10:24:00.917 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:24:00 compute-2 nova_compute[230216]: 2025-12-01 10:24:00.918 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:24:00 compute-2 nova_compute[230216]: 2025-12-01 10:24:00.993 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 10:24:00 compute-2 nova_compute[230216]: 2025-12-01 10:24:00.994 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 10:24:01 compute-2 nova_compute[230216]: 2025-12-01 10:24:01.016 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:24:01 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/857535630' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:24:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:01 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:24:01 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3609595814' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:24:01 compute-2 nova_compute[230216]: 2025-12-01 10:24:01.497 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:24:01 compute-2 nova_compute[230216]: 2025-12-01 10:24:01.503 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 10:24:01 compute-2 nova_compute[230216]: 2025-12-01 10:24:01.527 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 10:24:01 compute-2 nova_compute[230216]: 2025-12-01 10:24:01.528 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 10:24:01 compute-2 nova_compute[230216]: 2025-12-01 10:24:01.528 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.611s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:24:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:02 compute-2 ceph-mon[76053]: pgmap v1002: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 536 KiB/s rd, 2.4 MiB/s wr, 77 op/s
Dec 01 10:24:02 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3609595814' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:24:02 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3653343762' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:24:02 compute-2 nova_compute[230216]: 2025-12-01 10:24:02.528 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:24:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:24:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:02.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:24:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:24:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:02.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:24:03 compute-2 sudo[239911]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:24:03 compute-2 sudo[239911]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:24:03 compute-2 sudo[239911]: pam_unix(sudo:session): session closed for user root
Dec 01 10:24:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:03 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2023787751' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:24:03 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/297473408' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:24:03 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:24:03 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:24:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:24:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:04.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:24:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:24:04.712 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:24:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:24:04.712 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:24:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:24:04.712 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:24:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:24:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:04.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:24:05 compute-2 ceph-mon[76053]: pgmap v1003: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 377 KiB/s rd, 2.4 MiB/s wr, 71 op/s
Dec 01 10:24:05 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3583663575' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:24:05 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:24:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:05 compute-2 podman[239938]: 2025-12-01 10:24:05.404537726 +0000 UTC m=+0.059993893 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Dec 01 10:24:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:06 compute-2 sudo[239959]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:24:06 compute-2 sudo[239959]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:24:06 compute-2 sudo[239959]: pam_unix(sudo:session): session closed for user root
Dec 01 10:24:06 compute-2 ceph-mon[76053]: pgmap v1004: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 377 KiB/s rd, 2.4 MiB/s wr, 71 op/s
Dec 01 10:24:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:24:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:06.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:24:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:06.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/3822714645' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:24:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/3822714645' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:24:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:08 compute-2 ceph-mon[76053]: pgmap v1005: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 370 KiB/s rd, 2.4 MiB/s wr, 70 op/s
Dec 01 10:24:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:24:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:08.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:24:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:24:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:08.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:24:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:09 compute-2 podman[239986]: 2025-12-01 10:24:09.403321738 +0000 UTC m=+0.053658618 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 01 10:24:09 compute-2 ceph-mon[76053]: pgmap v1006: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 328 KiB/s rd, 2.1 MiB/s wr, 62 op/s
Dec 01 10:24:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:10 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:24:10.170 141949 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 01 10:24:10 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:24:10.171 141949 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 01 10:24:10 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:24:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:24:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:24:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:24:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:10.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:24:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:10.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:12 compute-2 ceph-mon[76053]: pgmap v1007: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 329 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Dec 01 10:24:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:12.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:12.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:14 compute-2 ceph-mon[76053]: pgmap v1008: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 16 KiB/s wr, 1 op/s
Dec 01 10:24:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:24:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:14.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:24:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:14.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:15 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2923719981' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:24:15 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:24:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:16 compute-2 ceph-mon[76053]: pgmap v1009: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 16 KiB/s wr, 1 op/s
Dec 01 10:24:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:24:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:16.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:24:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:24:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:16.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:24:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:18 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:24:18.174 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 01 10:24:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:18 compute-2 ceph-mon[76053]: pgmap v1010: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 17 KiB/s wr, 29 op/s
Dec 01 10:24:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:18.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:18.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:19 compute-2 podman[240017]: 2025-12-01 10:24:19.466385152 +0000 UTC m=+0.115391135 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 10:24:19 compute-2 ceph-mon[76053]: pgmap v1011: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 4.2 KiB/s wr, 28 op/s
Dec 01 10:24:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:20 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:24:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:20.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:20.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:21 compute-2 ceph-mon[76053]: pgmap v1012: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 4.2 KiB/s wr, 28 op/s
Dec 01 10:24:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:22.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:22.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:23 compute-2 ceph-mon[76053]: pgmap v1013: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Dec 01 10:24:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:24.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:24 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:24:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:24:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:24.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:24:25 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:24:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:25 compute-2 ceph-mon[76053]: pgmap v1014: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Dec 01 10:24:26 compute-2 sudo[240052]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:24:26 compute-2 sudo[240052]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:24:26 compute-2 sudo[240052]: pam_unix(sudo:session): session closed for user root
Dec 01 10:24:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:24:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:26.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:24:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:26.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:27 compute-2 ceph-mon[76053]: pgmap v1015: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Dec 01 10:24:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:24:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:28.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:24:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:28.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:29 compute-2 ceph-mon[76053]: pgmap v1016: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:24:30 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:24:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:24:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:30.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:24:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:30.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:32 compute-2 ceph-mon[76053]: pgmap v1017: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:24:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:32.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:24:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:32.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:24:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:33 compute-2 ceph-mon[76053]: pgmap v1018: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:24:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:24:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:34.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:24:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:34.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:35 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:24:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:35 compute-2 ceph-mon[76053]: pgmap v1019: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:24:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:36 compute-2 podman[240087]: 2025-12-01 10:24:36.421673742 +0000 UTC m=+0.080181820 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 01 10:24:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:24:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:36.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:24:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:36.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:37 compute-2 ceph-mon[76053]: pgmap v1020: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:24:37 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3348997553' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:24:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:24:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:38.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:24:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:38.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:40 compute-2 ceph-mon[76053]: pgmap v1021: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:24:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:24:40 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:24:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:40 compute-2 podman[240111]: 2025-12-01 10:24:40.431379222 +0000 UTC m=+0.088189567 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 01 10:24:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:40.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:24:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:40.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:24:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:42 compute-2 ceph-mon[76053]: pgmap v1022: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 01 10:24:42 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/4221105705' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 10:24:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:24:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:42.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:24:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:42.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:43 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1188300432' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 10:24:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:44 compute-2 ceph-mon[76053]: pgmap v1023: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 01 10:24:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:44.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:44.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:45 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:24:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:46 compute-2 ceph-mon[76053]: pgmap v1024: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 01 10:24:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:46 compute-2 sudo[240138]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:24:46 compute-2 sudo[240138]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:24:46 compute-2 sudo[240138]: pam_unix(sudo:session): session closed for user root
Dec 01 10:24:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:24:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:46.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:24:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:46.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:48 compute-2 ceph-mon[76053]: pgmap v1025: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 720 KiB/s rd, 1.8 MiB/s wr, 61 op/s
Dec 01 10:24:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:24:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:48.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:24:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:24:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:48.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:24:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:50 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:24:50 compute-2 ceph-mon[76053]: pgmap v1026: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 720 KiB/s rd, 1.8 MiB/s wr, 60 op/s
Dec 01 10:24:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:50 compute-2 podman[240167]: 2025-12-01 10:24:50.433220132 +0000 UTC m=+0.085974383 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 01 10:24:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:50.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:24:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:50.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:24:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:52 compute-2 ceph-mon[76053]: pgmap v1027: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Dec 01 10:24:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:52.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:52.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:53 compute-2 nova_compute[230216]: 2025-12-01 10:24:53.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:24:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:54 compute-2 ceph-mon[76053]: pgmap v1028: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Dec 01 10:24:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:54.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:54.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:55 compute-2 nova_compute[230216]: 2025-12-01 10:24:55.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:24:55 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:24:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:24:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:56.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:56 compute-2 ceph-mon[76053]: pgmap v1029: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Dec 01 10:24:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:24:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:56.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:24:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:57 compute-2 ceph-mon[76053]: pgmap v1030: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Dec 01 10:24:58 compute-2 nova_compute[230216]: 2025-12-01 10:24:58.202 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:24:58 compute-2 nova_compute[230216]: 2025-12-01 10:24:58.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:24:58 compute-2 nova_compute[230216]: 2025-12-01 10:24:58.206 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 10:24:58 compute-2 nova_compute[230216]: 2025-12-01 10:24:58.206 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 10:24:58 compute-2 nova_compute[230216]: 2025-12-01 10:24:58.228 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 10:24:58 compute-2 nova_compute[230216]: 2025-12-01 10:24:58.229 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:24:58 compute-2 nova_compute[230216]: 2025-12-01 10:24:58.229 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 01 10:24:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:24:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:58.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:24:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:24:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:24:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:58.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:24:59 compute-2 nova_compute[230216]: 2025-12-01 10:24:59.217 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:24:59 compute-2 nova_compute[230216]: 2025-12-01 10:24:59.218 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:24:59 compute-2 nova_compute[230216]: 2025-12-01 10:24:59.218 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 10:24:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:24:59 compute-2 ceph-mon[76053]: pgmap v1031: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.2 MiB/s rd, 41 op/s
Dec 01 10:24:59 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1287002598' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:25:00 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:25:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:00.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:00.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:01 compute-2 nova_compute[230216]: 2025-12-01 10:25:01.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:25:01 compute-2 nova_compute[230216]: 2025-12-01 10:25:01.229 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:25:01 compute-2 nova_compute[230216]: 2025-12-01 10:25:01.229 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:25:01 compute-2 nova_compute[230216]: 2025-12-01 10:25:01.230 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:25:01 compute-2 nova_compute[230216]: 2025-12-01 10:25:01.230 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 10:25:01 compute-2 nova_compute[230216]: 2025-12-01 10:25:01.230 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:25:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:01 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:25:01 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1816831964' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:25:01 compute-2 nova_compute[230216]: 2025-12-01 10:25:01.669 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:25:01 compute-2 nova_compute[230216]: 2025-12-01 10:25:01.821 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 10:25:01 compute-2 nova_compute[230216]: 2025-12-01 10:25:01.822 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5220MB free_disk=59.92213439941406GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 10:25:01 compute-2 nova_compute[230216]: 2025-12-01 10:25:01.823 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:25:01 compute-2 nova_compute[230216]: 2025-12-01 10:25:01.823 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:25:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:01 compute-2 ceph-mon[76053]: pgmap v1032: 353 pgs: 353 active+clean; 167 MiB data, 331 MiB used, 60 GiB / 60 GiB avail; 1.6 MiB/s rd, 3.9 MiB/s wr, 131 op/s
Dec 01 10:25:01 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1816831964' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:25:01 compute-2 nova_compute[230216]: 2025-12-01 10:25:01.931 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 10:25:01 compute-2 nova_compute[230216]: 2025-12-01 10:25:01.932 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 10:25:01 compute-2 nova_compute[230216]: 2025-12-01 10:25:01.980 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Refreshing inventories for resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 01 10:25:02 compute-2 nova_compute[230216]: 2025-12-01 10:25:02.069 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Updating ProviderTree inventory for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 01 10:25:02 compute-2 nova_compute[230216]: 2025-12-01 10:25:02.070 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Updating inventory in ProviderTree for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 01 10:25:02 compute-2 nova_compute[230216]: 2025-12-01 10:25:02.085 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Refreshing aggregate associations for resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 01 10:25:02 compute-2 nova_compute[230216]: 2025-12-01 10:25:02.104 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Refreshing trait associations for resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE42,HW_CPU_X86_SSSE3,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AESNI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NODE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_FMA3,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_ABM,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SHA,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_F16C,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 01 10:25:02 compute-2 nova_compute[230216]: 2025-12-01 10:25:02.126 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:25:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:02 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:25:02 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/911651656' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:25:02 compute-2 nova_compute[230216]: 2025-12-01 10:25:02.577 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:25:02 compute-2 nova_compute[230216]: 2025-12-01 10:25:02.583 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 10:25:02 compute-2 nova_compute[230216]: 2025-12-01 10:25:02.597 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 10:25:02 compute-2 nova_compute[230216]: 2025-12-01 10:25:02.599 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 10:25:02 compute-2 nova_compute[230216]: 2025-12-01 10:25:02.599 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.776s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:25:02 compute-2 nova_compute[230216]: 2025-12-01 10:25:02.600 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:25:02 compute-2 nova_compute[230216]: 2025-12-01 10:25:02.600 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 01 10:25:02 compute-2 nova_compute[230216]: 2025-12-01 10:25:02.610 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 01 10:25:02 compute-2 nova_compute[230216]: 2025-12-01 10:25:02.610 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:25:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:02.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:02 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/911651656' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:25:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:03.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:03 compute-2 sudo[240251]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:25:03 compute-2 sudo[240251]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:25:03 compute-2 sudo[240251]: pam_unix(sudo:session): session closed for user root
Dec 01 10:25:03 compute-2 sudo[240276]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:25:03 compute-2 sudo[240276]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:25:03 compute-2 nova_compute[230216]: 2025-12-01 10:25:03.619 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:25:03 compute-2 nova_compute[230216]: 2025-12-01 10:25:03.619 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:25:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:03 compute-2 ceph-mon[76053]: pgmap v1033: 353 pgs: 353 active+clean; 167 MiB data, 331 MiB used, 60 GiB / 60 GiB avail; 343 KiB/s rd, 3.9 MiB/s wr, 91 op/s
Dec 01 10:25:03 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3703375154' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:25:03 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2593606693' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:25:04 compute-2 sudo[240276]: pam_unix(sudo:session): session closed for user root
Dec 01 10:25:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:04.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:25:04.713 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:25:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:25:04.714 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:25:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:25:04.714 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:25:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:04 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:25:04 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:25:04 compute-2 ceph-mon[76053]: pgmap v1034: 353 pgs: 353 active+clean; 167 MiB data, 331 MiB used, 60 GiB / 60 GiB avail; 354 KiB/s rd, 4.0 MiB/s wr, 94 op/s
Dec 01 10:25:04 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:25:04 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:25:04 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:25:04 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:25:04 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:25:04 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3128697381' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 10:25:04 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/4127506147' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:25:04 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3837439768' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:25:04 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/243036720' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 10:25:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:05.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:05 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:25:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:06 compute-2 sudo[240334]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:25:06 compute-2 sudo[240334]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:25:06 compute-2 sudo[240334]: pam_unix(sudo:session): session closed for user root
Dec 01 10:25:06 compute-2 podman[240358]: 2025-12-01 10:25:06.648509675 +0000 UTC m=+0.054792646 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 10:25:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:06.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:07.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 01 10:25:07 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1807305778' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:25:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 01 10:25:07 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1807305778' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:25:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:07 compute-2 ceph-mon[76053]: pgmap v1035: 353 pgs: 353 active+clean; 167 MiB data, 331 MiB used, 60 GiB / 60 GiB avail; 354 KiB/s rd, 4.0 MiB/s wr, 95 op/s
Dec 01 10:25:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/1807305778' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:25:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/1807305778' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:25:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:08 compute-2 ceph-mon[76053]: pgmap v1036: 353 pgs: 353 active+clean; 167 MiB data, 331 MiB used, 60 GiB / 60 GiB avail; 352 KiB/s rd, 4.0 MiB/s wr, 94 op/s
Dec 01 10:25:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:08.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:09.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:09 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:25:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:10 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:25:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:10 compute-2 sudo[240382]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:25:10 compute-2 sudo[240382]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:25:10 compute-2 sudo[240382]: pam_unix(sudo:session): session closed for user root
Dec 01 10:25:10 compute-2 podman[240406]: 2025-12-01 10:25:10.568550822 +0000 UTC m=+0.052502689 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 10:25:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:10.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:11.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:11 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:25:11 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:25:11 compute-2 ceph-mon[76053]: pgmap v1037: 353 pgs: 353 active+clean; 167 MiB data, 331 MiB used, 60 GiB / 60 GiB avail; 352 KiB/s rd, 4.0 MiB/s wr, 94 op/s
Dec 01 10:25:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:12 compute-2 ceph-mon[76053]: pgmap v1038: 353 pgs: 353 active+clean; 167 MiB data, 331 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 28 KiB/s wr, 77 op/s
Dec 01 10:25:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:12.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:25:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:13.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:25:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:14 compute-2 ceph-mon[76053]: pgmap v1039: 353 pgs: 353 active+clean; 167 MiB data, 331 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 28 KiB/s wr, 77 op/s
Dec 01 10:25:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:14.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:25:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:15.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:25:15 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:25:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:25:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:16.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:25:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:17.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:17 compute-2 ceph-mon[76053]: pgmap v1040: 353 pgs: 353 active+clean; 167 MiB data, 331 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 27 KiB/s wr, 75 op/s
Dec 01 10:25:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:18.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:18 compute-2 ceph-mon[76053]: pgmap v1041: 353 pgs: 353 active+clean; 167 MiB data, 331 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 16 KiB/s wr, 74 op/s
Dec 01 10:25:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:19.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:20 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:25:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:20 compute-2 ceph-mon[76053]: pgmap v1042: 353 pgs: 353 active+clean; 167 MiB data, 331 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 16 KiB/s wr, 74 op/s
Dec 01 10:25:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:20.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:25:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:21.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:25:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:21 compute-2 podman[240437]: 2025-12-01 10:25:21.442586281 +0000 UTC m=+0.099361689 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec 01 10:25:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:22.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:23.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:24 compute-2 ceph-mon[76053]: pgmap v1043: 353 pgs: 353 active+clean; 200 MiB data, 360 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 137 op/s
Dec 01 10:25:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:25:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:24.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:25:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:25.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:25 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:25:25 compute-2 ceph-mon[76053]: pgmap v1044: 353 pgs: 353 active+clean; 200 MiB data, 360 MiB used, 60 GiB / 60 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Dec 01 10:25:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:25:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:26 compute-2 sudo[240470]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:25:26 compute-2 sudo[240470]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:25:26 compute-2 sudo[240470]: pam_unix(sudo:session): session closed for user root
Dec 01 10:25:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:26.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:25:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:27.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:25:27 compute-2 ceph-mon[76053]: pgmap v1045: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 01 10:25:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:25:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:28.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:25:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:25:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:29.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:25:29 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:25:29.079 141949 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 01 10:25:29 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:25:29.081 141949 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 01 10:25:29 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:25:29.081 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 01 10:25:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:29 compute-2 ceph-mon[76053]: pgmap v1046: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 01 10:25:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:30 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:25:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.002000047s ======
Dec 01 10:25:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:30.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000047s
Dec 01 10:25:30 compute-2 ceph-mon[76053]: pgmap v1047: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 01 10:25:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:25:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:31.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:25:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:32.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:33.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:33 compute-2 ceph-mon[76053]: pgmap v1048: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 333 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Dec 01 10:25:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:25:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:34.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:25:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:34 compute-2 ceph-mon[76053]: pgmap v1049: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 7.5 KiB/s rd, 14 KiB/s wr, 2 op/s
Dec 01 10:25:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:25:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:35.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:25:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:35 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:25:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:36 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/4000332064' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:25:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:25:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:36.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:25:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:37.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:37 compute-2 ceph-mon[76053]: pgmap v1050: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 27 KiB/s rd, 22 KiB/s wr, 30 op/s
Dec 01 10:25:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:37 compute-2 podman[240505]: 2025-12-01 10:25:37.407483906 +0000 UTC m=+0.061739435 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Dec 01 10:25:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:25:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:38.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:25:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:38 compute-2 ceph-mon[76053]: pgmap v1051: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 26 KiB/s rd, 10 KiB/s wr, 29 op/s
Dec 01 10:25:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:25:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:39.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:25:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:25:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:40 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:25:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:25:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:40.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:25:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:41.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:41 compute-2 ceph-mon[76053]: pgmap v1052: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 26 KiB/s rd, 10 KiB/s wr, 29 op/s
Dec 01 10:25:41 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1783100042' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:25:41 compute-2 podman[240530]: 2025-12-01 10:25:41.392349876 +0000 UTC m=+0.054166939 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible)
Dec 01 10:25:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:25:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:42.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:25:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:43.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:44 compute-2 ceph-mon[76053]: pgmap v1053: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 45 KiB/s rd, 12 KiB/s wr, 57 op/s
Dec 01 10:25:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:25:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:44.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:25:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:45.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:45 compute-2 ceph-mon[76053]: pgmap v1054: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 38 KiB/s rd, 9.3 KiB/s wr, 56 op/s
Dec 01 10:25:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:45 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:25:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:46.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:46 compute-2 sudo[240556]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:25:46 compute-2 sudo[240556]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:25:46 compute-2 sudo[240556]: pam_unix(sudo:session): session closed for user root
Dec 01 10:25:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:47.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:47 compute-2 ceph-mon[76053]: pgmap v1055: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 38 KiB/s rd, 9.3 KiB/s wr, 56 op/s
Dec 01 10:25:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:48 compute-2 ceph-mon[76053]: pgmap v1056: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Dec 01 10:25:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:48.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:49.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:50 compute-2 ceph-mon[76053]: pgmap v1057: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Dec 01 10:25:50 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:25:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:50.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:25:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:51.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:25:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:52 compute-2 podman[240587]: 2025-12-01 10:25:52.443021912 +0000 UTC m=+0.092121522 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 01 10:25:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:52.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:52 compute-2 ceph-mon[76053]: pgmap v1058: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Dec 01 10:25:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:53.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:53 compute-2 nova_compute[230216]: 2025-12-01 10:25:53.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:25:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:54 compute-2 ceph-mon[76053]: pgmap v1059: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:25:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:54.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:55.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:55 compute-2 nova_compute[230216]: 2025-12-01 10:25:55.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:25:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:25:55 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:25:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:56 compute-2 ceph-mon[76053]: pgmap v1060: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:25:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:56.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:57.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:57 compute-2 nova_compute[230216]: 2025-12-01 10:25:57.201 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:25:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:58 compute-2 ceph-mon[76053]: pgmap v1061: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:25:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:58.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:25:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:25:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:59.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:25:59 compute-2 nova_compute[230216]: 2025-12-01 10:25:59.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:25:59 compute-2 nova_compute[230216]: 2025-12-01 10:25:59.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 10:25:59 compute-2 nova_compute[230216]: 2025-12-01 10:25:59.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 10:25:59 compute-2 nova_compute[230216]: 2025-12-01 10:25:59.222 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 10:25:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:25:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:00 compute-2 nova_compute[230216]: 2025-12-01 10:26:00.216 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:26:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:00 compute-2 ceph-mon[76053]: pgmap v1062: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:26:00 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:26:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:26:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:00.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:26:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:01.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:01 compute-2 nova_compute[230216]: 2025-12-01 10:26:01.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:26:01 compute-2 nova_compute[230216]: 2025-12-01 10:26:01.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:26:01 compute-2 nova_compute[230216]: 2025-12-01 10:26:01.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 10:26:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:02 compute-2 nova_compute[230216]: 2025-12-01 10:26:02.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:26:02 compute-2 nova_compute[230216]: 2025-12-01 10:26:02.228 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:26:02 compute-2 nova_compute[230216]: 2025-12-01 10:26:02.229 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:26:02 compute-2 nova_compute[230216]: 2025-12-01 10:26:02.229 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:26:02 compute-2 nova_compute[230216]: 2025-12-01 10:26:02.229 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 10:26:02 compute-2 nova_compute[230216]: 2025-12-01 10:26:02.230 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:26:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:02 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:26:02 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/413638406' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:26:02 compute-2 nova_compute[230216]: 2025-12-01 10:26:02.698 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:26:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:26:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:02.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:26:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:02 compute-2 nova_compute[230216]: 2025-12-01 10:26:02.857 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 10:26:02 compute-2 nova_compute[230216]: 2025-12-01 10:26:02.858 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5224MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 10:26:02 compute-2 nova_compute[230216]: 2025-12-01 10:26:02.858 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:26:02 compute-2 nova_compute[230216]: 2025-12-01 10:26:02.859 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:26:02 compute-2 nova_compute[230216]: 2025-12-01 10:26:02.920 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 10:26:02 compute-2 nova_compute[230216]: 2025-12-01 10:26:02.920 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 10:26:02 compute-2 nova_compute[230216]: 2025-12-01 10:26:02.944 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:26:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:03.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:03 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:26:03 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3317222816' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:26:03 compute-2 nova_compute[230216]: 2025-12-01 10:26:03.379 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:26:03 compute-2 nova_compute[230216]: 2025-12-01 10:26:03.386 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 10:26:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:03 compute-2 nova_compute[230216]: 2025-12-01 10:26:03.405 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 10:26:03 compute-2 nova_compute[230216]: 2025-12-01 10:26:03.406 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 10:26:03 compute-2 nova_compute[230216]: 2025-12-01 10:26:03.407 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.548s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:26:03 compute-2 ceph-mon[76053]: pgmap v1063: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:26:03 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/413638406' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:26:03 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2558204076' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:26:03 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3317222816' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:26:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:04 compute-2 nova_compute[230216]: 2025-12-01 10:26:04.407 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:26:04 compute-2 nova_compute[230216]: 2025-12-01 10:26:04.409 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:26:04 compute-2 ceph-mon[76053]: pgmap v1064: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:26:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:26:04.714 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:26:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:26:04.714 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:26:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:26:04.714 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:26:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:04.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:05.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:05 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1239419402' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:26:05 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3729794993' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:26:05 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:26:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:06 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/866047266' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:26:06 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3871086598' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:26:06 compute-2 ceph-mon[76053]: pgmap v1065: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 01 10:26:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:26:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:06.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:26:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:06 compute-2 sudo[240674]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:26:06 compute-2 sudo[240674]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:26:06 compute-2 sudo[240674]: pam_unix(sudo:session): session closed for user root
Dec 01 10:26:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 01 10:26:07 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1302564135' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:26:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 01 10:26:07 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1302564135' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:26:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:26:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:07.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:26:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/1302564135' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:26:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/1302564135' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:26:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:08 compute-2 podman[240701]: 2025-12-01 10:26:08.386446694 +0000 UTC m=+0.049438195 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Dec 01 10:26:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1623528846' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 10:26:08 compute-2 ceph-mon[76053]: pgmap v1066: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 01 10:26:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1760247541' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 01 10:26:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:08.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:09.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:09 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:26:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:10 compute-2 sudo[240722]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:26:10 compute-2 sudo[240722]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:26:10 compute-2 sudo[240722]: pam_unix(sudo:session): session closed for user root
Dec 01 10:26:10 compute-2 ceph-mon[76053]: pgmap v1067: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 01 10:26:10 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:26:10 compute-2 sudo[240747]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:26:10 compute-2 sudo[240747]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:26:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:10.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:11.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:11 compute-2 sudo[240747]: pam_unix(sudo:session): session closed for user root
Dec 01 10:26:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:12 compute-2 podman[240806]: 2025-12-01 10:26:12.410883503 +0000 UTC m=+0.060342282 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 01 10:26:12 compute-2 ceph-mon[76053]: pgmap v1068: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Dec 01 10:26:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:12.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:13.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:14 compute-2 ceph-mon[76053]: pgmap v1069: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Dec 01 10:26:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:14.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:15.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:15 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:26:15 compute-2 sshd-session[240827]: Received disconnect from 45.78.219.119 port 41224:11: Bye Bye [preauth]
Dec 01 10:26:15 compute-2 sshd-session[240827]: Disconnected from authenticating user root 45.78.219.119 port 41224 [preauth]
Dec 01 10:26:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:15 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:26:15 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:26:15 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:26:15 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:26:15 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:26:15 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:26:15 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:26:15 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:26:15 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:26:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:16.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:16 compute-2 ceph-mon[76053]: pgmap v1070: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 1.9 MiB/s wr, 107 op/s
Dec 01 10:26:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:26:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:17.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:26:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:18.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:18 compute-2 ceph-mon[76053]: pgmap v1071: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 13 KiB/s wr, 78 op/s
Dec 01 10:26:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:19.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:20 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:26:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:20.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:21 compute-2 ceph-mon[76053]: pgmap v1072: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 13 KiB/s wr, 78 op/s
Dec 01 10:26:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:21.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:21 compute-2 sudo[240839]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:26:21 compute-2 sudo[240839]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:26:21 compute-2 sudo[240839]: pam_unix(sudo:session): session closed for user root
Dec 01 10:26:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:22 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:26:22 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:26:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:26:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:22.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:26:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:26:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:23.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:26:23 compute-2 ceph-mon[76053]: pgmap v1073: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 13 KiB/s wr, 78 op/s
Dec 01 10:26:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:23 compute-2 podman[240864]: 2025-12-01 10:26:23.417166401 +0000 UTC m=+0.077381749 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 10:26:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:24.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:25.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:25 compute-2 ceph-mon[76053]: pgmap v1074: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 68 op/s
Dec 01 10:26:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:26:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:25 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:26:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:26.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:26 compute-2 sudo[240894]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:26:26 compute-2 sudo[240894]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:26:26 compute-2 sudo[240894]: pam_unix(sudo:session): session closed for user root
Dec 01 10:26:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:27.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:27 compute-2 ceph-mon[76053]: pgmap v1075: 353 pgs: 353 active+clean; 121 MiB data, 336 MiB used, 60 GiB / 60 GiB avail; 2.4 MiB/s rd, 2.3 MiB/s wr, 135 op/s
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.496930) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584787497096, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2370, "num_deletes": 251, "total_data_size": 6260895, "memory_usage": 6355248, "flush_reason": "Manual Compaction"}
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584787524786, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 4044953, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31302, "largest_seqno": 33667, "table_properties": {"data_size": 4035372, "index_size": 6011, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 20031, "raw_average_key_size": 20, "raw_value_size": 4016184, "raw_average_value_size": 4110, "num_data_blocks": 258, "num_entries": 977, "num_filter_entries": 977, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764584581, "oldest_key_time": 1764584581, "file_creation_time": 1764584787, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 27938 microseconds, and 15418 cpu microseconds.
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.524878) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 4044953 bytes OK
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.524907) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.526444) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.526465) EVENT_LOG_v1 {"time_micros": 1764584787526457, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.526492) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 6250450, prev total WAL file size 6250450, number of live WAL files 2.
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.528948) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032353130' seq:72057594037927935, type:22 .. '7061786F730032373632' seq:0, type:0; will stop at (end)
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(3950KB)], [60(12MB)]
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584787529082, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 16801434, "oldest_snapshot_seqno": -1}
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 6272 keys, 14621943 bytes, temperature: kUnknown
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584787615918, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 14621943, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14580421, "index_size": 24772, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15685, "raw_key_size": 160553, "raw_average_key_size": 25, "raw_value_size": 14467773, "raw_average_value_size": 2306, "num_data_blocks": 995, "num_entries": 6272, "num_filter_entries": 6272, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764584787, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.616559) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 14621943 bytes
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.618370) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 192.6 rd, 167.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.9, 12.2 +0.0 blob) out(13.9 +0.0 blob), read-write-amplify(7.8) write-amplify(3.6) OK, records in: 6793, records dropped: 521 output_compression: NoCompression
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.618411) EVENT_LOG_v1 {"time_micros": 1764584787618392, "job": 36, "event": "compaction_finished", "compaction_time_micros": 87238, "compaction_time_cpu_micros": 40669, "output_level": 6, "num_output_files": 1, "total_output_size": 14621943, "num_input_records": 6793, "num_output_records": 6272, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584787620822, "job": 36, "event": "table_file_deletion", "file_number": 62}
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584787626261, "job": 36, "event": "table_file_deletion", "file_number": 60}
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.528791) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.626646) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.626658) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.626660) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.626662) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:26:27 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.626664) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:26:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:28 compute-2 ceph-mon[76053]: pgmap v1076: 353 pgs: 353 active+clean; 121 MiB data, 336 MiB used, 60 GiB / 60 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 01 10:26:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:28.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:29.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:30 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:26:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:30.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:30 compute-2 ceph-mon[76053]: pgmap v1077: 353 pgs: 353 active+clean; 121 MiB data, 336 MiB used, 60 GiB / 60 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 01 10:26:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:26:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:31.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:26:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:32.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:32 compute-2 ceph-mon[76053]: pgmap v1078: 353 pgs: 353 active+clean; 121 MiB data, 336 MiB used, 60 GiB / 60 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 01 10:26:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:33.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:34.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:34 compute-2 ceph-mon[76053]: pgmap v1079: 353 pgs: 353 active+clean; 121 MiB data, 336 MiB used, 60 GiB / 60 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 01 10:26:34 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:26:34.844 141949 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 01 10:26:34 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:26:34.845 141949 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 01 10:26:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:35.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:35 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:26:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:26:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:36.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:26:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:37 compute-2 ceph-mon[76053]: pgmap v1080: 353 pgs: 353 active+clean; 121 MiB data, 336 MiB used, 60 GiB / 60 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Dec 01 10:26:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:26:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:37.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:26:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:26:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:38.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:26:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:39 compute-2 ceph-mon[76053]: pgmap v1081: 353 pgs: 353 active+clean; 121 MiB data, 336 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 16 KiB/s wr, 1 op/s
Dec 01 10:26:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:39.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:39 compute-2 podman[240931]: 2025-12-01 10:26:39.406007405 +0000 UTC m=+0.056663420 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 01 10:26:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:26:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:40 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:26:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:40.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:26:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:41.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:26:41 compute-2 ceph-mon[76053]: pgmap v1082: 353 pgs: 353 active+clean; 121 MiB data, 336 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 16 KiB/s wr, 1 op/s
Dec 01 10:26:41 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1709413327' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:26:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:42.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:43.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:43 compute-2 ceph-mon[76053]: pgmap v1083: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 17 KiB/s wr, 29 op/s
Dec 01 10:26:43 compute-2 podman[240955]: 2025-12-01 10:26:43.396539152 +0000 UTC m=+0.054666062 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd)
Dec 01 10:26:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:26:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:44.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:26:44 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:26:44.847 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 01 10:26:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:45.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:45 compute-2 ceph-mon[76053]: pgmap v1084: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 5.8 KiB/s wr, 28 op/s
Dec 01 10:26:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:45 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:26:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:26:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:46.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:26:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:47 compute-2 sudo[240979]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:26:47 compute-2 sudo[240979]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:26:47 compute-2 sudo[240979]: pam_unix(sudo:session): session closed for user root
Dec 01 10:26:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:47.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:47 compute-2 ceph-mon[76053]: pgmap v1085: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 5.8 KiB/s wr, 28 op/s
Dec 01 10:26:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:26:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:48.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:26:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:49.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:49 compute-2 ceph-mon[76053]: pgmap v1086: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Dec 01 10:26:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:50 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:26:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:50.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:51.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:51 compute-2 ceph-mon[76053]: pgmap v1087: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Dec 01 10:26:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:26:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:52.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:26:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:53.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:53 compute-2 ceph-mon[76053]: pgmap v1088: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Dec 01 10:26:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:54 compute-2 podman[241012]: 2025-12-01 10:26:54.425482844 +0000 UTC m=+0.087933679 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 01 10:26:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:54.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:55.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:55 compute-2 nova_compute[230216]: 2025-12-01 10:26:55.209 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:26:55 compute-2 nova_compute[230216]: 2025-12-01 10:26:55.209 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:26:55 compute-2 ceph-mon[76053]: pgmap v1089: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:26:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:26:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:55 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:26:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:56.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:57.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:57 compute-2 ceph-mon[76053]: pgmap v1090: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:26:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:26:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:58.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:26:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:26:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:26:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:59.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:26:59 compute-2 nova_compute[230216]: 2025-12-01 10:26:59.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:26:59 compute-2 nova_compute[230216]: 2025-12-01 10:26:59.209 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 10:26:59 compute-2 nova_compute[230216]: 2025-12-01 10:26:59.209 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 10:26:59 compute-2 nova_compute[230216]: 2025-12-01 10:26:59.230 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 10:26:59 compute-2 ceph-mon[76053]: pgmap v1091: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:26:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:26:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:00 compute-2 nova_compute[230216]: 2025-12-01 10:27:00.222 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:27:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:00 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:27:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:00.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:27:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:01.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:27:01 compute-2 nova_compute[230216]: 2025-12-01 10:27:01.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:27:01 compute-2 ceph-mon[76053]: pgmap v1092: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:27:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:02 compute-2 nova_compute[230216]: 2025-12-01 10:27:02.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:27:02 compute-2 nova_compute[230216]: 2025-12-01 10:27:02.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 10:27:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:02.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:03.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:03 compute-2 ceph-mon[76053]: pgmap v1093: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:27:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:04 compute-2 nova_compute[230216]: 2025-12-01 10:27:04.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:27:04 compute-2 nova_compute[230216]: 2025-12-01 10:27:04.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:27:04 compute-2 nova_compute[230216]: 2025-12-01 10:27:04.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:27:04 compute-2 nova_compute[230216]: 2025-12-01 10:27:04.230 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:27:04 compute-2 nova_compute[230216]: 2025-12-01 10:27:04.231 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:27:04 compute-2 nova_compute[230216]: 2025-12-01 10:27:04.231 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:27:04 compute-2 nova_compute[230216]: 2025-12-01 10:27:04.231 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 10:27:04 compute-2 nova_compute[230216]: 2025-12-01 10:27:04.231 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:27:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:04 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:27:04 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2307304294' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:27:04 compute-2 nova_compute[230216]: 2025-12-01 10:27:04.667 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:27:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:27:04.715 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:27:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:27:04.715 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:27:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:27:04.716 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:27:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:27:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:04.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:27:04 compute-2 nova_compute[230216]: 2025-12-01 10:27:04.830 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 10:27:04 compute-2 nova_compute[230216]: 2025-12-01 10:27:04.832 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5221MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 10:27:04 compute-2 nova_compute[230216]: 2025-12-01 10:27:04.832 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:27:04 compute-2 nova_compute[230216]: 2025-12-01 10:27:04.832 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:27:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:04 compute-2 nova_compute[230216]: 2025-12-01 10:27:04.910 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 10:27:04 compute-2 nova_compute[230216]: 2025-12-01 10:27:04.910 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 10:27:04 compute-2 nova_compute[230216]: 2025-12-01 10:27:04.928 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:27:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:27:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:05.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:27:05 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:27:05 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1174345469' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:27:05 compute-2 nova_compute[230216]: 2025-12-01 10:27:05.387 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:27:05 compute-2 nova_compute[230216]: 2025-12-01 10:27:05.392 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 10:27:05 compute-2 nova_compute[230216]: 2025-12-01 10:27:05.410 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 10:27:05 compute-2 nova_compute[230216]: 2025-12-01 10:27:05.411 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 10:27:05 compute-2 nova_compute[230216]: 2025-12-01 10:27:05.411 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.579s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:27:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:05 compute-2 ceph-mon[76053]: pgmap v1094: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:27:05 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2307304294' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:27:05 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1174345469' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:27:05 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:27:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:06 compute-2 ceph-mon[76053]: pgmap v1095: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:27:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:06.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:07 compute-2 sudo[241094]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:27:07 compute-2 sudo[241094]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:27:07 compute-2 sudo[241094]: pam_unix(sudo:session): session closed for user root
Dec 01 10:27:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:07.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3971070539' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:27:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/1614089083' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:27:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/1614089083' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:27:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/109927610' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:27:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2876913939' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:27:08 compute-2 ceph-mon[76053]: pgmap v1096: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:27:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/662702271' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:27:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:08.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:09.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:09 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:27:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:10 compute-2 podman[241123]: 2025-12-01 10:27:10.39979085 +0000 UTC m=+0.052698876 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 01 10:27:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:10 compute-2 ceph-mon[76053]: pgmap v1097: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:27:10 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:27:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:10.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:11.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:12.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:13.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:13 compute-2 ceph-mon[76053]: pgmap v1098: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:27:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:14 compute-2 podman[241146]: 2025-12-01 10:27:14.401616333 +0000 UTC m=+0.057515853 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 01 10:27:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:14 compute-2 ceph-mon[76053]: pgmap v1099: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:27:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:14.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:15.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:15 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:27:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:16 compute-2 ceph-mon[76053]: pgmap v1100: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:27:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:16.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:17.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:18.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:18 compute-2 ceph-mon[76053]: pgmap v1101: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:27:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:19.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:20 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:27:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:20.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:21.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:21 compute-2 ceph-mon[76053]: pgmap v1102: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:27:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:21 compute-2 sudo[241176]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:27:21 compute-2 sudo[241176]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:27:21 compute-2 sudo[241176]: pam_unix(sudo:session): session closed for user root
Dec 01 10:27:21 compute-2 sudo[241201]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:27:21 compute-2 sudo[241201]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:27:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:22 compute-2 sudo[241201]: pam_unix(sudo:session): session closed for user root
Dec 01 10:27:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:22.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:27:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:23.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:27:23 compute-2 ceph-mon[76053]: pgmap v1103: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:27:23 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:27:23 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:27:23 compute-2 ceph-mon[76053]: pgmap v1104: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 575 B/s rd, 0 op/s
Dec 01 10:27:23 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:27:23 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:27:23 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:27:23 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:27:23 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:27:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:24 compute-2 ceph-mon[76053]: pgmap v1105: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 576 B/s rd, 0 op/s
Dec 01 10:27:24 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:27:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:24.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:25.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:25 compute-2 podman[241258]: 2025-12-01 10:27:25.432083164 +0000 UTC m=+0.090635847 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 01 10:27:25 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:27:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:26.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:27:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:27.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:27:27 compute-2 ceph-mon[76053]: pgmap v1106: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 576 B/s rd, 0 op/s
Dec 01 10:27:27 compute-2 sudo[241287]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:27:27 compute-2 sudo[241287]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:27:27 compute-2 sudo[241287]: pam_unix(sudo:session): session closed for user root
Dec 01 10:27:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:28.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:29 compute-2 sudo[241313]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:27:29 compute-2 sudo[241313]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:27:29 compute-2 sudo[241313]: pam_unix(sudo:session): session closed for user root
Dec 01 10:27:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:27:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:29.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:27:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:29 compute-2 ceph-mon[76053]: pgmap v1107: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 576 B/s rd, 0 op/s
Dec 01 10:27:29 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:27:29 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:27:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:30 compute-2 ceph-mon[76053]: pgmap v1108: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 576 B/s rd, 0 op/s
Dec 01 10:27:30 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:27:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:30.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:27:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:31.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:27:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:32.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:33.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:33 compute-2 ceph-mon[76053]: pgmap v1109: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 576 B/s rd, 0 op/s
Dec 01 10:27:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:34 compute-2 ceph-mon[76053]: pgmap v1110: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:27:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:34.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:35.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:35 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:27:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:36 compute-2 ceph-mon[76053]: pgmap v1111: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:27:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:36.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:37.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:38 compute-2 ceph-mon[76053]: pgmap v1112: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:27:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:38.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:27:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:39.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:27:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:39 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:27:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:40 compute-2 ceph-mon[76053]: pgmap v1113: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:27:40 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:27:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:27:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:40.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:27:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:41.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:41 compute-2 podman[241350]: 2025-12-01 10:27:41.387585348 +0000 UTC m=+0.049362773 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Dec 01 10:27:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:42.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:27:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:43.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:27:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:43 compute-2 ceph-mon[76053]: pgmap v1114: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:27:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:44 compute-2 ceph-mon[76053]: pgmap v1115: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:27:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:27:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:44.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:27:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:27:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:45.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:27:45 compute-2 podman[241373]: 2025-12-01 10:27:45.39945286 +0000 UTC m=+0.057708768 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 01 10:27:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:45 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:27:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:46 compute-2 ceph-mon[76053]: pgmap v1116: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:27:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:27:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:46.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:27:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:47.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:47 compute-2 sudo[241398]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:27:47 compute-2 sudo[241398]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:27:47 compute-2 sudo[241398]: pam_unix(sudo:session): session closed for user root
Dec 01 10:27:47 compute-2 sshd-session[241397]: Accepted publickey for zuul from 192.168.122.10 port 41272 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 10:27:47 compute-2 systemd-logind[795]: New session 55 of user zuul.
Dec 01 10:27:47 compute-2 systemd[1]: Started Session 55 of User zuul.
Dec 01 10:27:47 compute-2 sshd-session[241397]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 10:27:47 compute-2 sudo[241426]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Dec 01 10:27:47 compute-2 sudo[241426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:27:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:48 compute-2 ceph-mon[76053]: pgmap v1117: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:27:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:48.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:27:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:49.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:27:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:50 compute-2 ceph-mon[76053]: from='client.25555 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:27:50 compute-2 ceph-mon[76053]: from='client.26048 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:27:50 compute-2 ceph-mon[76053]: pgmap v1118: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:27:50 compute-2 ceph-mon[76053]: from='client.16311 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:27:50 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:27:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:50.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:51 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Dec 01 10:27:51 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2261726682' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 01 10:27:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:27:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:51.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:27:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:51 compute-2 ceph-mon[76053]: from='client.25564 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:27:51 compute-2 ceph-mon[76053]: from='client.26057 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:27:51 compute-2 ceph-mon[76053]: from='client.16317 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:27:51 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2261726682' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 01 10:27:51 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2639347518' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 01 10:27:51 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2039892442' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 01 10:27:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:52 compute-2 ceph-mon[76053]: pgmap v1119: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:27:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:52.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:53.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:54 compute-2 ceph-mon[76053]: pgmap v1120: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:27:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:54.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:27:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:55.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:27:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:55 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:27:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:27:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:56 compute-2 ovs-vsctl[241838]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec 01 10:27:56 compute-2 podman[241810]: 2025-12-01 10:27:56.446448267 +0000 UTC m=+0.091307963 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 01 10:27:56 compute-2 ceph-mon[76053]: pgmap v1121: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:27:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:56.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:57.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:57 compute-2 nova_compute[230216]: 2025-12-01 10:27:57.414 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:27:57 compute-2 nova_compute[230216]: 2025-12-01 10:27:57.415 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:27:57 compute-2 virtqemud[229722]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec 01 10:27:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:57 compute-2 virtqemud[229722]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec 01 10:27:57 compute-2 virtqemud[229722]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 01 10:27:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:58 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: cache status {prefix=cache status} (starting...)
Dec 01 10:27:58 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec 01 10:27:58 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: client ls {prefix=client ls} (starting...)
Dec 01 10:27:58 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec 01 10:27:58 compute-2 lvm[242181]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 01 10:27:58 compute-2 lvm[242181]: VG ceph_vg0 finished
Dec 01 10:27:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:58 compute-2 ceph-mon[76053]: pgmap v1122: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:27:58 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: damage ls {prefix=damage ls} (starting...)
Dec 01 10:27:58 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec 01 10:27:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:58.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:58 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: dump loads {prefix=dump loads} (starting...)
Dec 01 10:27:58 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec 01 10:27:59 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Dec 01 10:27:59 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2329677152' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec 01 10:27:59 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Dec 01 10:27:59 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec 01 10:27:59 compute-2 nova_compute[230216]: 2025-12-01 10:27:59.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:27:59 compute-2 nova_compute[230216]: 2025-12-01 10:27:59.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 10:27:59 compute-2 nova_compute[230216]: 2025-12-01 10:27:59.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 10:27:59 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Dec 01 10:27:59 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec 01 10:27:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:27:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:27:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:59.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:27:59 compute-2 nova_compute[230216]: 2025-12-01 10:27:59.346 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 10:27:59 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Dec 01 10:27:59 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec 01 10:27:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:59 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 01 10:27:59 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2360835235' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:27:59 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Dec 01 10:27:59 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec 01 10:27:59 compute-2 ceph-mon[76053]: from='client.25576 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:27:59 compute-2 ceph-mon[76053]: from='client.26069 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:27:59 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2329677152' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec 01 10:27:59 compute-2 ceph-mon[76053]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec 01 10:27:59 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/183974291' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec 01 10:27:59 compute-2 ceph-mon[76053]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec 01 10:27:59 compute-2 ceph-mon[76053]: from='client.26084 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:27:59 compute-2 ceph-mon[76053]: from='client.25588 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:27:59 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2360835235' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:27:59 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3111091241' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:27:59 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Dec 01 10:27:59 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec 01 10:27:59 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Dec 01 10:27:59 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1721795399' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Dec 01 10:27:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:27:59 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: get subtrees {prefix=get subtrees} (starting...)
Dec 01 10:27:59 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec 01 10:28:00 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: ops {prefix=ops} (starting...)
Dec 01 10:28:00 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec 01 10:28:00 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Dec 01 10:28:00 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3248093188' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Dec 01 10:28:00 compute-2 nova_compute[230216]: 2025-12-01 10:28:00.339 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:28:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:00 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Dec 01 10:28:00 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2599300060' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Dec 01 10:28:00 compute-2 ceph-mon[76053]: from='client.25600 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:00 compute-2 ceph-mon[76053]: from='client.26096 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:00 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1721795399' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Dec 01 10:28:00 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3584704780' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Dec 01 10:28:00 compute-2 ceph-mon[76053]: from='client.16338 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:00 compute-2 ceph-mon[76053]: from='client.26108 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:00 compute-2 ceph-mon[76053]: from='client.25612 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:00 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3517039617' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec 01 10:28:00 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3248093188' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Dec 01 10:28:00 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3608231201' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Dec 01 10:28:00 compute-2 ceph-mon[76053]: from='client.16356 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:00 compute-2 ceph-mon[76053]: pgmap v1123: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:28:00 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3149209817' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Dec 01 10:28:00 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3941467253' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:28:00 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:28:00 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: session ls {prefix=session ls} (starting...)
Dec 01 10:28:00 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec 01 10:28:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:00.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:00 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: status {prefix=status} (starting...)
Dec 01 10:28:00 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec 01 10:28:00 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1759886200' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 01 10:28:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:01.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:01 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec 01 10:28:01 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/403042370' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 01 10:28:01 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Dec 01 10:28:01 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/393143358' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec 01 10:28:01 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2599300060' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Dec 01 10:28:01 compute-2 ceph-mon[76053]: from='client.25630 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:01 compute-2 ceph-mon[76053]: from='client.26138 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:01 compute-2 ceph-mon[76053]: from='client.16368 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:01 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1179325203' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 01 10:28:01 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1656107208' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Dec 01 10:28:01 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1759886200' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 01 10:28:01 compute-2 ceph-mon[76053]: from='client.25645 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:01 compute-2 ceph-mon[76053]: from='client.26150 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:01 compute-2 ceph-mon[76053]: from='client.16383 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:01 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2521562130' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 01 10:28:01 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3403051108' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Dec 01 10:28:01 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/403042370' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 01 10:28:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:01 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec 01 10:28:01 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1218432900' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 01 10:28:02 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Dec 01 10:28:02 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/993981221' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec 01 10:28:02 compute-2 nova_compute[230216]: 2025-12-01 10:28:02.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:28:02 compute-2 nova_compute[230216]: 2025-12-01 10:28:02.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:28:02 compute-2 nova_compute[230216]: 2025-12-01 10:28:02.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:28:02 compute-2 nova_compute[230216]: 2025-12-01 10:28:02.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 10:28:02 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 01 10:28:02 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/396734622' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 10:28:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:02 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/393143358' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec 01 10:28:02 compute-2 ceph-mon[76053]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec 01 10:28:02 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2807277753' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec 01 10:28:02 compute-2 ceph-mon[76053]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec 01 10:28:02 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2495423836' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Dec 01 10:28:02 compute-2 ceph-mon[76053]: from='client.16401 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:02 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1042598284' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 01 10:28:02 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1218432900' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 01 10:28:02 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/993981221' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec 01 10:28:02 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1749002467' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 01 10:28:02 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/298894126' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec 01 10:28:02 compute-2 ceph-mon[76053]: from='client.16419 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:02 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/396734622' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 10:28:02 compute-2 ceph-mon[76053]: pgmap v1124: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:28:02 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2552589729' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 10:28:02 compute-2 ceph-mon[76053]: from='client.25687 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:02 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/767115279' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 01 10:28:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:02.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:02 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Dec 01 10:28:02 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/501556868' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Dec 01 10:28:03 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec 01 10:28:03 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4279321811' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 01 10:28:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:03.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:03 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Dec 01 10:28:03 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3712036744' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec 01 10:28:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:03 compute-2 ceph-mon[76053]: from='client.26198 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:03 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1199932636' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec 01 10:28:03 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1545124193' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Dec 01 10:28:03 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/4102015110' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Dec 01 10:28:03 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/501556868' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Dec 01 10:28:03 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2634542928' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 01 10:28:03 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1812810523' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Dec 01 10:28:03 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1457628468' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec 01 10:28:03 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/696210361' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 01 10:28:03 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/4279321811' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 01 10:28:03 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3712036744' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec 01 10:28:03 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/433939349' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 10:28:03 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/415455245' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec 01 10:28:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:04 compute-2 nova_compute[230216]: 2025-12-01 10:28:04.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:28:04 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec 01 10:28:04 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2398960754' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:19.820710+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69705728 unmapped: 425984 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:20.820888+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69713920 unmapped: 417792 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:21.821042+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69713920 unmapped: 417792 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:22.821204+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69713920 unmapped: 417792 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:23.821358+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829934 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69730304 unmapped: 401408 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:24.821539+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657206c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69730304 unmapped: 401408 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:25.821718+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69746688 unmapped: 385024 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:26.821884+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69754880 unmapped: 376832 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:27.822046+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69754880 unmapped: 376832 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:28.822252+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829343 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69787648 unmapped: 344064 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:29.822487+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69795840 unmapped: 335872 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:30.822685+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69820416 unmapped: 311296 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.601384163s of 13.612139702s, submitted: 3
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:31.822879+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69828608 unmapped: 303104 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:32.823035+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69828608 unmapped: 303104 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:33.823291+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828752 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69853184 unmapped: 278528 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:34.823452+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69853184 unmapped: 278528 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:35.823903+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69869568 unmapped: 262144 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:36.824075+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69877760 unmapped: 253952 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:37.824310+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69877760 unmapped: 253952 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:38.824658+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828752 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69885952 unmapped: 245760 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:39.825077+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69885952 unmapped: 245760 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:40.825254+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69885952 unmapped: 245760 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:41.825489+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69894144 unmapped: 237568 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:42.825717+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69902336 unmapped: 229376 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:43.825888+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828752 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69902336 unmapped: 229376 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563658210000 session 0x563657972780
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:44.826191+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69902336 unmapped: 229376 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:45.826331+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69910528 unmapped: 221184 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:46.826525+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69910528 unmapped: 221184 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:47.826791+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69910528 unmapped: 221184 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:48.827047+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828752 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69935104 unmapped: 196608 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:49.827212+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69935104 unmapped: 196608 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:50.827359+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69951488 unmapped: 180224 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:51.827693+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69951488 unmapped: 180224 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:52.827988+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69959680 unmapped: 172032 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:53.828227+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828752 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69967872 unmapped: 163840 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:54.828392+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69967872 unmapped: 163840 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:55.828538+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69967872 unmapped: 163840 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:56.828737+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69976064 unmapped: 155648 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:57.828914+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 26.784265518s of 26.792785645s, submitted: 2
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69976064 unmapped: 155648 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:58.829102+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 830264 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69984256 unmapped: 147456 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:55:59.829271+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69984256 unmapped: 147456 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:00.829438+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657e44400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69992448 unmapped: 139264 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:01.829691+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69992448 unmapped: 139264 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:02.829841+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69992448 unmapped: 139264 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:03.829996+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831776 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70008832 unmapped: 122880 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:04.830162+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70017024 unmapped: 114688 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:05.830396+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70017024 unmapped: 114688 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:06.830617+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70025216 unmapped: 106496 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:07.830844+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70033408 unmapped: 98304 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:08.831081+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831776 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70041600 unmapped: 90112 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:09.831304+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70049792 unmapped: 81920 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:10.831532+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70066176 unmapped: 65536 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:11.831831+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70074368 unmapped: 57344 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:12.832041+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70074368 unmapped: 57344 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:13.832350+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831776 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70074368 unmapped: 57344 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:14.832574+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70082560 unmapped: 49152 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:15.832886+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70082560 unmapped: 49152 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:16.833095+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70090752 unmapped: 40960 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:17.833346+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70090752 unmapped: 40960 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:18.833568+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831776 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70107136 unmapped: 24576 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:19.833785+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70107136 unmapped: 24576 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:20.833952+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70107136 unmapped: 24576 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:21.834123+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70115328 unmapped: 16384 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:22.834287+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70123520 unmapped: 8192 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:23.834512+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831776 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70131712 unmapped: 0 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:24.834713+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70139904 unmapped: 1040384 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:25.835030+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70156288 unmapped: 1024000 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:26.835232+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70156288 unmapped: 1024000 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:27.835535+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70164480 unmapped: 1015808 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:28.835722+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831776 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70172672 unmapped: 1007616 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:29.836009+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70180864 unmapped: 999424 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:30.836277+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70189056 unmapped: 991232 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:31.836558+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70189056 unmapped: 991232 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:32.836815+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70197248 unmapped: 983040 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:33.837121+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831776 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70197248 unmapped: 983040 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:34.837382+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70205440 unmapped: 974848 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:35.837675+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70205440 unmapped: 974848 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:36.838041+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70205440 unmapped: 974848 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:37.838273+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70213632 unmapped: 966656 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:38.838494+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831776 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70221824 unmapped: 958464 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:39.838780+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70221824 unmapped: 958464 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:40.838930+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70221824 unmapped: 958464 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:41.839845+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70221824 unmapped: 958464 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:42.840120+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70230016 unmapped: 950272 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:43.840299+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831776 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70238208 unmapped: 942080 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:44.840450+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70238208 unmapped: 942080 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:45.840617+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70254592 unmapped: 925696 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:46.840785+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70254592 unmapped: 925696 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:47.840945+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70254592 unmapped: 925696 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:48.841089+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831776 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70262784 unmapped: 917504 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:49.841249+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657206c00 session 0x5636577f12c0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70262784 unmapped: 917504 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:50.841390+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70270976 unmapped: 909312 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:51.841800+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70270976 unmapped: 909312 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:52.841945+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70279168 unmapped: 901120 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:53.842088+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831776 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70287360 unmapped: 892928 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:54.842225+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70295552 unmapped: 884736 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:55.842391+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70303744 unmapped: 876544 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:56.842731+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70303744 unmapped: 876544 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:57.842902+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70311936 unmapped: 868352 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:58.843089+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831776 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70311936 unmapped: 868352 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:56:59.843290+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70320128 unmapped: 860160 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:00.843431+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70320128 unmapped: 860160 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:01.843674+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70320128 unmapped: 860160 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:02.843909+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70336512 unmapped: 843776 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:03.844138+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831776 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70336512 unmapped: 843776 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:04.844288+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70344704 unmapped: 835584 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:05.844441+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70352896 unmapped: 827392 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:06.844683+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70352896 unmapped: 827392 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:07.844966+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70352896 unmapped: 827392 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:08.845156+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831776 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70369280 unmapped: 811008 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:09.845370+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 71.542472839s of 71.550804138s, submitted: 2
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70369280 unmapped: 811008 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:10.845654+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70377472 unmapped: 802816 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:11.845885+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70377472 unmapped: 802816 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:12.846115+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70377472 unmapped: 802816 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:13.846337+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 830594 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70385664 unmapped: 794624 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:14.846479+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70393856 unmapped: 786432 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:15.846644+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70402048 unmapped: 778240 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:16.846786+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70402048 unmapped: 778240 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:17.846959+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44400 session 0x563656b24b40
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70410240 unmapped: 770048 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:18.847109+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 830594 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70418432 unmapped: 761856 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:19.847287+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70426624 unmapped: 753664 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:20.847425+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70434816 unmapped: 745472 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:21.847634+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70434816 unmapped: 745472 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:22.847809+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70434816 unmapped: 745472 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:23.848105+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 830594 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70443008 unmapped: 737280 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:24.848303+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70443008 unmapped: 737280 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:25.848472+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70443008 unmapped: 737280 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:26.848642+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70451200 unmapped: 729088 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:27.848777+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70459392 unmapped: 720896 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:28.848938+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 830594 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70467584 unmapped: 712704 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:29.849091+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70467584 unmapped: 712704 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:30.849264+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70467584 unmapped: 712704 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:31.849511+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.452596664s of 22.460166931s, submitted: 2
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 704512 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:32.849719+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 704512 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:33.849875+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 832106 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70483968 unmapped: 696320 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:34.850021+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657e44800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70492160 unmapped: 688128 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:35.850206+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 679936 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:36.850352+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 679936 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:37.850569+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 679936 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:38.850780+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833618 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70508544 unmapped: 671744 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:39.850930+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70524928 unmapped: 655360 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:40.851090+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70524928 unmapped: 655360 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:41.851293+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70533120 unmapped: 647168 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:42.851456+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 638976 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:43.851697+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833618 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 630784 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:44.851856+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:45.852052+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 630784 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:46.852307+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 622592 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:47.852572+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 622592 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:48.852785+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 622592 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833618 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:49.852967+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70565888 unmapped: 614400 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:50.853121+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70565888 unmapped: 614400 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:51.853418+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70565888 unmapped: 614400 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:52.853641+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70574080 unmapped: 606208 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:53.853796+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70574080 unmapped: 606208 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833618 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:54.853945+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70574080 unmapped: 606208 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:55.854195+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70582272 unmapped: 598016 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:56.854483+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70582272 unmapped: 598016 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:57.854732+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70582272 unmapped: 598016 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:58.854932+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70590464 unmapped: 589824 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833618 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:57:59.855207+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70598656 unmapped: 581632 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:00.855434+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70615040 unmapped: 565248 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:01.855676+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70623232 unmapped: 557056 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:02.855823+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70623232 unmapped: 557056 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:03.856008+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70631424 unmapped: 548864 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833618 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:04.856179+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70631424 unmapped: 548864 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:05.856391+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70631424 unmapped: 548864 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:06.856574+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70639616 unmapped: 540672 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:07.857011+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70647808 unmapped: 532480 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:08.857193+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70647808 unmapped: 532480 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833618 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:09.857402+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70656000 unmapped: 524288 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:10.857566+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70656000 unmapped: 524288 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:11.857857+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70664192 unmapped: 516096 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:12.858012+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70664192 unmapped: 516096 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:13.858202+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70664192 unmapped: 516096 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833618 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:14.858402+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70672384 unmapped: 507904 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:15.858681+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70688768 unmapped: 491520 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:16.858855+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70696960 unmapped: 483328 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:17.858948+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70696960 unmapped: 483328 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:18.859092+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70696960 unmapped: 483328 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833618 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:19.859263+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70705152 unmapped: 475136 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:20.859413+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70705152 unmapped: 475136 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:21.859620+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70705152 unmapped: 475136 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:22.859803+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70705152 unmapped: 475136 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:23.859967+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70713344 unmapped: 466944 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833618 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:24.860119+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70713344 unmapped: 466944 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:25.860301+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70729728 unmapped: 450560 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:26.860460+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70729728 unmapped: 450560 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:27.860697+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70729728 unmapped: 450560 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:28.860837+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70737920 unmapped: 442368 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833618 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:29.861005+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70737920 unmapped: 442368 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:30.861169+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70746112 unmapped: 434176 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:31.861361+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70746112 unmapped: 434176 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:32.861526+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70754304 unmapped: 425984 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:33.861697+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70754304 unmapped: 425984 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833618 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:34.861914+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70754304 unmapped: 425984 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:35.862059+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70770688 unmapped: 409600 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:36.862228+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70770688 unmapped: 409600 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:37.862398+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70770688 unmapped: 409600 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:38.862654+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70795264 unmapped: 385024 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833618 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:39.862831+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70795264 unmapped: 385024 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:40.863005+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70795264 unmapped: 385024 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:41.863227+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70803456 unmapped: 376832 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:42.863435+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70803456 unmapped: 376832 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:43.863630+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70811648 unmapped: 368640 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833618 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:44.863825+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70811648 unmapped: 368640 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:45.863966+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70819840 unmapped: 360448 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:46.864140+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70828032 unmapped: 352256 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:47.864323+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70828032 unmapped: 352256 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:48.864539+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70828032 unmapped: 352256 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833618 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:49.864669+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70836224 unmapped: 344064 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:50.865645+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70836224 unmapped: 344064 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:51.866817+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70844416 unmapped: 335872 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:52.867235+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70844416 unmapped: 335872 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:53.867584+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70860800 unmapped: 319488 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 81.895309448s of 81.902931213s, submitted: 2
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836642 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:54.867766+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70868992 unmapped: 311296 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:55.867948+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70868992 unmapped: 311296 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:56.868440+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70877184 unmapped: 303104 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:57.868661+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70885376 unmapped: 294912 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:58.868823+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70885376 unmapped: 294912 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:58:59.868991+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836642 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70893568 unmapped: 286720 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:00.869431+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70909952 unmapped: 270336 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:01.869666+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70918144 unmapped: 262144 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:02.869804+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70918144 unmapped: 262144 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:03.869968+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70918144 unmapped: 262144 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:04.870196+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835460 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70926336 unmapped: 253952 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:05.870497+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70926336 unmapped: 253952 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:06.870658+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70934528 unmapped: 245760 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:07.870797+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70934528 unmapped: 245760 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:08.871278+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70942720 unmapped: 237568 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:09.871433+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835460 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70942720 unmapped: 237568 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:10.871787+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70959104 unmapped: 221184 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:11.871989+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70959104 unmapped: 221184 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:12.872192+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70959104 unmapped: 221184 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:13.872353+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70967296 unmapped: 212992 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44800 session 0x563655c532c0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:14.872649+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835460 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70967296 unmapped: 212992 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:15.872903+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70975488 unmapped: 204800 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:16.873041+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70975488 unmapped: 204800 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:17.873221+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70975488 unmapped: 204800 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:18.873436+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70983680 unmapped: 196608 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:19.873666+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835460 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70983680 unmapped: 196608 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:20.873891+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70983680 unmapped: 196608 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:21.874166+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70991872 unmapped: 188416 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:22.874365+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70991872 unmapped: 188416 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:23.874542+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71000064 unmapped: 180224 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:24.874674+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835460 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71000064 unmapped: 180224 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:25.874826+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71000064 unmapped: 180224 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:26.875039+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71008256 unmapped: 172032 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Cumulative writes: 5432 writes, 24K keys, 5432 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 5432 writes, 800 syncs, 6.79 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 5432 writes, 24K keys, 5432 commit groups, 1.0 writes per commit group, ingest: 18.76 MB, 0.03 MB/s
                                           Interval WAL: 5432 writes, 800 syncs, 6.79 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.020       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.020       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.02              0.00         1    0.020       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5636543369b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 4e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5636543369b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 4e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5636543369b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 4e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.006       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:27.875191+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71073792 unmapped: 106496 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:28.875333+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71081984 unmapped: 98304 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:29.875544+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835460 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71081984 unmapped: 98304 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:30.875681+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71090176 unmapped: 90112 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658210800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:31.875890+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658210c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 37.885166168s of 37.986934662s, submitted: 4
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71090176 unmapped: 90112 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:32.876074+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 81920 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:33.876245+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 81920 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:34.876382+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836972 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71106560 unmapped: 73728 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:35.876552+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71090176 unmapped: 90112 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:36.876774+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71090176 unmapped: 90112 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:37.877033+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 81920 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:38.877272+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 81920 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:39.877490+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835790 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 81920 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:40.877678+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71106560 unmapped: 73728 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:41.877965+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71106560 unmapped: 73728 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:42.878137+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71114752 unmapped: 65536 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:43.878310+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71114752 unmapped: 65536 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:44.878559+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835790 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71114752 unmapped: 65536 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:45.878699+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71122944 unmapped: 57344 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:46.878875+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71122944 unmapped: 57344 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:47.879141+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71122944 unmapped: 57344 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:48.879278+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71122944 unmapped: 57344 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:49.879428+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835790 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 49152 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563658210c00 session 0x56365861cd20
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:50.879631+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 49152 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:51.879832+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71139328 unmapped: 40960 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:52.879956+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71139328 unmapped: 40960 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:53.880809+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 32768 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:54.880988+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835790 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 32768 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:55.881492+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71155712 unmapped: 24576 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:56.881656+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71163904 unmapped: 16384 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:57.881844+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71172096 unmapped: 8192 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:58.882107+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71172096 unmapped: 8192 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T09:59:59.882324+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835790 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71180288 unmapped: 0 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:00.882486+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71180288 unmapped: 0 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:01.882673+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71180288 unmapped: 0 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:02.882881+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 1040384 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:03.883047+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 32.038856506s of 32.052764893s, submitted: 3
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 1040384 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:04.883282+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 838814 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 1040384 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:05.883873+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1032192 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:06.884082+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657206c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1032192 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:07.884227+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1024000 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:08.884494+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1024000 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:09.884809+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 838814 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1024000 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:10.884951+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71213056 unmapped: 1015808 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:11.885123+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71213056 unmapped: 1015808 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:12.885422+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1007616 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:13.885741+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 991232 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:14.885986+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 838223 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 991232 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:15.886225+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 983040 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:16.886369+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 983040 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:17.886531+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 983040 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:18.886683+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71254016 unmapped: 974848 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:19.886853+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 838223 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71254016 unmapped: 974848 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:20.886983+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 966656 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:21.887138+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 966656 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:22.887265+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 966656 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:23.887401+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71270400 unmapped: 958464 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:24.887550+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 838223 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71270400 unmapped: 958464 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:25.887721+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 950272 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:26.887886+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.957025528s of 22.967693329s, submitted: 3
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 1966080 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:27.888014+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 1957888 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:28.889828+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 1949696 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:29.889952+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 838223 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 1875968 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:30.890815+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 1875968 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:31.890999+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 1875968 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:32.891396+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 1875968 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:33.892809+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71409664 unmapped: 1867776 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:34.893768+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 838223 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 1851392 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:35.894892+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 1851392 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:36.895843+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 1843200 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:37.896914+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 1843200 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:38.897721+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 1843200 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657206c00 session 0x563655b9af00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:39.897905+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 838223 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 1843200 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:40.898695+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 1843200 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:41.898901+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 1843200 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:42.899503+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563656719800 session 0x563655c530e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 1843200 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:43.899725+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 1843200 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:44.899979+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 838223 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 1826816 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:45.900170+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 1826816 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:46.900400+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 1826816 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:47.900768+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71458816 unmapped: 1818624 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:48.900950+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71458816 unmapped: 1818624 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:49.901404+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 838223 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71467008 unmapped: 1810432 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:50.901606+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71467008 unmapped: 1810432 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:51.901794+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71475200 unmapped: 1802240 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:52.901947+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 1794048 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:53.902184+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 1785856 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:54.902427+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 838223 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 1769472 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:55.902622+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 1769472 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:56.902852+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71516160 unmapped: 1761280 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:57.902999+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 1744896 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:58.903147+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 31.175931931s of 32.282161713s, submitted: 203
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 1744896 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:00:59.903312+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837632 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71540736 unmapped: 1736704 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:00.903540+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71540736 unmapped: 1736704 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:01.903798+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657e44400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71548928 unmapped: 1728512 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:02.903956+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 1720320 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:03.904138+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 1720320 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:04.904347+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 839144 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71565312 unmapped: 1712128 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:05.904949+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71565312 unmapped: 1712128 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:06.905255+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 1703936 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:07.905400+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 1703936 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:08.905642+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 1703936 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:09.905818+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 839144 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 1695744 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:10.905981+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 1695744 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:11.906156+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 1687552 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:12.906310+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 1687552 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:13.906484+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 1679360 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:14.906707+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 839144 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71622656 unmapped: 1654784 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:15.906868+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44400 session 0x563656b665a0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71622656 unmapped: 1654784 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:16.907134+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 1646592 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:17.907383+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 1646592 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:18.907711+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71639040 unmapped: 1638400 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:19.908034+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 839144 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71639040 unmapped: 1638400 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:20.908231+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71639040 unmapped: 1638400 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:21.908455+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 1630208 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:22.908676+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71655424 unmapped: 1622016 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:23.908840+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71655424 unmapped: 1622016 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:24.909045+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 839144 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 1613824 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:25.909244+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 1613824 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:26.909406+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71671808 unmapped: 1605632 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:27.909663+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71680000 unmapped: 1597440 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:28.909855+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71680000 unmapped: 1597440 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:29.910035+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 30.517019272s of 30.796197891s, submitted: 2
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840656 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 1589248 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:30.910234+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 1589248 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:31.910518+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 1589248 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:32.919233+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657e44800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71696384 unmapped: 1581056 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:33.919425+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71696384 unmapped: 1581056 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:34.919695+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840656 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 1564672 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:35.919931+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 1564672 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:36.920138+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 1564672 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:37.920286+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 1564672 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:38.920502+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 1564672 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:39.920693+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840065 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:40.920868+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 1564672 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:41.921093+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 1564672 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:42.921535+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 1564672 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:43.921753+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 1556480 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:44.922641+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 1556480 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840065 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:45.922838+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 1556480 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:46.923043+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 1556480 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:47.923241+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 1556480 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:48.923407+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 1556480 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:49.923645+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 1556480 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840065 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:50.923806+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 1556480 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:51.924043+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 1556480 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:52.924261+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 1556480 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:53.924455+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 1556480 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:54.924673+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 1556480 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840065 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:55.924850+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71745536 unmapped: 1531904 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:56.925036+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71745536 unmapped: 1531904 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:57.925264+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71745536 unmapped: 1531904 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:58.925486+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 1523712 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:01:59.925730+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 1523712 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840065 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:00.925957+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 1523712 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:01.926199+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 1523712 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:02.926363+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 1523712 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:03.926516+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 1523712 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:04.926716+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 1523712 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840065 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:05.926887+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 1523712 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:06.927033+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 1523712 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:07.927204+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 1523712 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44800 session 0x5636586885a0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:08.927376+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 1515520 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:09.927559+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 1515520 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840065 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:10.927750+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 1515520 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:11.927974+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 1515520 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:12.928168+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 1515520 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:13.928345+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 1515520 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:14.928499+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 1515520 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840065 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:15.928657+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 1499136 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:16.928842+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 1499136 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:17.929026+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 1499136 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:18.929191+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 1490944 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:19.929542+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 1490944 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840065 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:20.929824+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 1490944 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:21.930135+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 51.468963623s of 51.715778351s, submitted: 2
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658210000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71819264 unmapped: 1458176 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:22.930364+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71819264 unmapped: 1458176 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:23.930624+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71819264 unmapped: 1458176 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:24.930845+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71819264 unmapped: 1458176 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563658210800 session 0x56365793e780
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843089 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:25.931064+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71819264 unmapped: 1458176 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:26.931253+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71819264 unmapped: 1458176 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:27.931546+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71819264 unmapped: 1458176 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:28.931762+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71819264 unmapped: 1458176 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:29.931976+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71819264 unmapped: 1458176 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843089 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:30.932184+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 1449984 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:31.932436+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 1449984 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:32.932660+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 1449984 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:33.932909+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 1449984 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:34.933087+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 1449984 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843089 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:35.933282+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 1433600 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:36.933439+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 1433600 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:37.933627+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 1433600 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:38.933773+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.819169998s of 17.039936066s, submitted: 2
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658210800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71868416 unmapped: 1409024 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:39.933951+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71868416 unmapped: 1409024 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846113 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:40.934112+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71868416 unmapped: 1409024 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:41.934307+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71868416 unmapped: 1409024 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:42.934517+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71868416 unmapped: 1409024 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:43.934679+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71868416 unmapped: 1409024 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:44.934871+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71868416 unmapped: 1409024 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845522 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:45.935038+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 1400832 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:46.935179+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 1400832 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:47.935310+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 1400832 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:48.935463+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 1400832 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:49.935631+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 1400832 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 844931 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:50.935781+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 1400832 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:51.935955+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 1400832 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:52.936116+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 1400832 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:53.936276+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71884800 unmapped: 1392640 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:54.936448+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71884800 unmapped: 1392640 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 844931 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:55.936705+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71901184 unmapped: 1376256 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:56.937390+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71901184 unmapped: 1376256 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:57.937568+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71901184 unmapped: 1376256 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:58.937729+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:02:59.937912+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 844931 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:00.938052+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:01.938227+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:02.938421+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:03.938576+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:04.938846+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:05.939026+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 844931 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:06.939218+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:07.939463+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:08.939723+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:09.939934+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:10.940078+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 844931 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:11.940261+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:12.940405+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:13.940567+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:14.940824+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:15.941140+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 844931 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:16.941334+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:17.941479+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:18.941723+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:19.941955+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:20.942131+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 844931 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:21.942449+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:22.942643+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:23.942849+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:24.943076+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:25.943296+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 844931 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:26.943454+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:27.943657+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:28.943810+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:29.943982+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:30.944205+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 844931 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:31.944463+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:32.944683+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563658210800 session 0x563655b9ab40
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:33.944894+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71942144 unmapped: 1335296 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:34.945063+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71942144 unmapped: 1335296 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:35.945214+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 844931 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 1318912 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:36.945384+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 1318912 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:37.945552+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 1318912 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:38.945667+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 1318912 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:39.945843+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 1318912 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:40.946067+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 844931 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 1318912 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:41.946248+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 1318912 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:42.946461+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 1318912 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:43.946739+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 1318912 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:44.946895+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 1318912 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:45.947040+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 844931 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 1318912 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:46.947218+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 1318912 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:47.947407+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 68.849288940s of 68.914070129s, submitted: 4
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71966720 unmapped: 1310720 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:48.947564+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71966720 unmapped: 1310720 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:49.947736+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71966720 unmapped: 1310720 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:50.947894+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657206c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846443 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71966720 unmapped: 1310720 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:51.948157+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71966720 unmapped: 1310720 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:52.948325+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71966720 unmapped: 1310720 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:53.948497+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71966720 unmapped: 1310720 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:54.948656+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71983104 unmapped: 1294336 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:55.948835+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845852 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71983104 unmapped: 1294336 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:56.948981+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71983104 unmapped: 1294336 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:57.949180+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71983104 unmapped: 1294336 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:58.949365+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71983104 unmapped: 1294336 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:03:59.949545+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71983104 unmapped: 1294336 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:00.949754+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845261 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71983104 unmapped: 1294336 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:01.950014+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71983104 unmapped: 1294336 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:02.950201+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71991296 unmapped: 1286144 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:03.950367+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71991296 unmapped: 1286144 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:04.950586+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71991296 unmapped: 1286144 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:05.950789+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845261 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71999488 unmapped: 1277952 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:06.951382+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71999488 unmapped: 1277952 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:07.951560+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71999488 unmapped: 1277952 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:08.952725+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71999488 unmapped: 1277952 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:09.952859+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72015872 unmapped: 1261568 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:10.953007+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845261 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72015872 unmapped: 1261568 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:11.953219+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72015872 unmapped: 1261568 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:12.953373+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72015872 unmapped: 1261568 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:13.953518+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72015872 unmapped: 1261568 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:14.953678+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72032256 unmapped: 1245184 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:15.953817+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845261 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72032256 unmapped: 1245184 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:16.954013+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72032256 unmapped: 1245184 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:17.954215+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72032256 unmapped: 1245184 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:18.954387+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72032256 unmapped: 1245184 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:19.954543+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72032256 unmapped: 1245184 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:20.954742+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845261 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72032256 unmapped: 1245184 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:21.954959+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72032256 unmapped: 1245184 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:22.955146+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72040448 unmapped: 1236992 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:23.955303+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72040448 unmapped: 1236992 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:24.955476+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72040448 unmapped: 1236992 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:25.955667+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845261 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72040448 unmapped: 1236992 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:26.955892+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72040448 unmapped: 1236992 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:27.956055+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72040448 unmapped: 1236992 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:28.956210+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72040448 unmapped: 1236992 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:29.956445+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72040448 unmapped: 1236992 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:30.956628+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845261 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72040448 unmapped: 1236992 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:31.956842+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72040448 unmapped: 1236992 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:32.957008+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72040448 unmapped: 1236992 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:33.957161+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72048640 unmapped: 1228800 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:34.957342+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563658210000 session 0x56365861cd20
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 1212416 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:35.957538+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845261 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 1212416 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:36.957764+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 1212416 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:37.958425+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657206c00 session 0x56365861de00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 1212416 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:38.958618+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 1212416 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:39.958778+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 1212416 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:40.958922+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845261 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 1212416 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:41.959151+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 1212416 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:42.959357+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 1212416 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:43.959503+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 1212416 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:44.959677+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 1212416 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:45.959889+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845261 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 1212416 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:46.960074+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 1212416 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:47.960245+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 1212416 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:48.960414+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 1212416 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:49.960629+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72073216 unmapped: 1204224 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:50.960875+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845261 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72073216 unmapped: 1204224 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:51.961092+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657e44400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72089600 unmapped: 1187840 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:52.961283+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72089600 unmapped: 1187840 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:53.961454+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72089600 unmapped: 1187840 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:54.961635+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657e44800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 67.279685974s of 67.465194702s, submitted: 3
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72105984 unmapped: 1171456 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:55.961767+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846773 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72105984 unmapped: 1171456 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:56.961969+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72105984 unmapped: 1171456 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:57.962126+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:58.962299+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:04:59.962463+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:00.962697+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846182 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:01.962922+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:02.963074+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:03.963269+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:04.963429+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:05.963583+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846182 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:06.963728+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:07.963872+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:08.964039+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44400 session 0x56365874e5a0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:09.964218+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44800 session 0x56365874eb40
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:10.964401+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846182 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:11.964630+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:12.964816+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:13.964986+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72105984 unmapped: 1171456 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:14.965149+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:15.965319+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846182 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:16.965491+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:17.965687+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:18.967663+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:19.967840+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:20.968048+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846182 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:21.968275+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:22.968446+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:23.968630+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:24.968814+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:25.969015+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846182 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:26.969241+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657e44800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:27.969451+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:28.969649+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:29.969857+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:30.970021+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846182 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:31.970240+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:32.970454+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 37.810539246s of 37.817691803s, submitted: 2
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:33.970689+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72130560 unmapped: 1146880 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:34.970835+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563656719800 session 0x5636586f0f00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:35.971068+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845591 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:36.971219+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:37.971385+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:38.971524+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:39.971908+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:40.972048+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845591 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:41.972276+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:42.972428+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:43.972948+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:44.973202+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:45.973454+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845591 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:46.973678+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:47.973928+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:48.974149+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657206c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.038652420s of 16.043939590s, submitted: 1
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:49.974284+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:50.974445+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:51.974654+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:52.974865+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:53.975184+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:54.975363+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:55.975523+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:56.975718+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:57.975868+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:58.976086+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:59.976340+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:00.976576+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:01.976853+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:02.977247+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:03.977445+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:04.977589+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:05.977823+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:06.978016+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:07.978201+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:08.978379+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:09.978542+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:10.978738+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:11.979024+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:12.979186+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:13.979369+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:14.979530+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:15.979744+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:16.979899+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:17.980212+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:18.981307+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:19.982664+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:20.984079+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:21.984296+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:22.984447+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:23.984644+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:24.985095+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:25.985260+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:26.985425+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:27.985572+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:28.986105+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:29.986293+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:30.986481+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:31.987004+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:32.987190+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:33.987645+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:34.987972+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:35.988148+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:36.988460+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:37.988662+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:38.988830+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:39.989052+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:40.989338+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:41.989550+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:42.989762+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:43.989903+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:44.990050+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:45.990236+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:46.990705+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:47.990878+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:48.991159+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657206c00 session 0x5636586f12c0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:49.991492+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:50.991686+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:51.991850+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:52.992020+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:53.992190+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:54.992356+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:55.992538+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:56.992707+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:57.992904+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:58.993035+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:59.993211+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:00.993354+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:01.993543+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:02.993730+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 74.315505981s of 74.319114685s, submitted: 1
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:03.993893+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 73728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:04.994092+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 73728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:05.994313+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 73728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657e44400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848615 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:06.994478+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 65536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:07.994691+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 65536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:08.994839+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 65536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:09.995039+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 65536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:10.995230+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 65536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848615 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:11.995426+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 65536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:12.995607+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 65536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:13.995851+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73220096 unmapped: 57344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:14.996093+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73220096 unmapped: 57344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:15.996501+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 49152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848615 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:16.996670+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 49152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.970045090s of 13.995903015s, submitted: 1
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:17.996836+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:18.997017+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:19.997190+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:20.997394+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:21.997679+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847433 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:22.997880+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:23.998050+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:24.998193+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:25.998349+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:26.998495+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847433 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:27.998653+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:28.998832+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:29.999005+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:30.999152+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:31.999330+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847433 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:32.999493+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:33.999681+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44400 session 0x563655b9ab40
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:34.999857+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:36.000002+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:37.000149+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847433 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:38.000485+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:39.000679+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:40.000898+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:41.001121+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:42.001377+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847433 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:43.001627+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:44.001794+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:45.002006+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:46.002155+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:47.002302+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847433 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:48.002476+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:49.002686+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:50.002864+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658210000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 33.174911499s of 33.210174561s, submitted: 2
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:51.003066+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:52.003265+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848945 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:53.003423+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:54.003636+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:55.003793+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:56.003976+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:57.004136+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848945 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658210800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:58.004305+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:59.004481+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:00.004669+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:01.004816+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:02.005060+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851969 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:03.005232+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.003420830s of 13.025437355s, submitted: 3
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:04.005422+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:05.005661+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:06.005871+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:07.006079+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851378 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:08.006275+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:09.006480+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:10.006643+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:11.006887+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:12.007112+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851378 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:13.007292+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:14.007482+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:15.007664+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:16.007926+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44800 session 0x56365874f680
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:17.008230+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851378 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:18.008469+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:19.008671+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:20.008862+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:21.009051+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:22.009255+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851378 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:23.009428+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:24.009645+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:25.009806+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:26.010001+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:27.010196+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851378 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:28.010394+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread fragmentation_score=0.000022 took=0.000081s
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:29.010557+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:30.010710+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 26.606863022s of 26.614999771s, submitted: 2
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:31.010904+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:32.011212+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852890 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:33.011395+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658211000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:34.011550+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:35.011690+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:36.011846+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:37.012050+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852299 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:38.012225+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:39.012397+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:40.012576+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:41.012812+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:42.013060+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:43.013223+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:44.013350+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:45.013539+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:46.013718+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:47.013864+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:48.014015+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:49.014173+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:50.014322+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:51.014452+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:52.014651+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:53.014814+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:54.014987+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:55.015144+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:56.015323+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:57.015496+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:58.015696+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:59.015905+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:00.016187+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:01.016356+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:02.016546+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:03.016748+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:04.016895+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:05.017064+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:06.017248+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:07.017413+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:08.017630+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:09.017808+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:10.018091+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563658210000 session 0x563658762b40
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:11.018307+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:12.018571+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:13.018970+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:14.019147+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:15.019366+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:16.019516+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:17.019710+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:18.019838+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:19.019988+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:20.020162+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:21.020406+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:22.020750+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:23.020909+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:24.021130+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 53.971313477s of 53.995193481s, submitted: 3
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:25.021273+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:26.021522+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:27.021698+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658210000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853220 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Cumulative writes: 5898 writes, 24K keys, 5898 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
                                           Cumulative WAL: 5898 writes, 1028 syncs, 5.74 writes per sync, written: 0.02 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 466 writes, 729 keys, 466 commit groups, 1.0 writes per commit group, ingest: 0.24 MB, 0.00 MB/s
                                           Interval WAL: 466 writes, 228 syncs, 2.04 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.020       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.020       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.02              0.00         1    0.020       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5636543369b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5636543369b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5636543369b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.006       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72245248 unmapped: 1032192 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:28.021955+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72245248 unmapped: 1032192 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:29.022125+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72245248 unmapped: 1032192 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:30.022342+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72245248 unmapped: 1032192 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:31.022483+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72245248 unmapped: 1032192 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:32.022694+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853220 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72245248 unmapped: 1032192 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:33.022930+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:34.023147+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:35.023332+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:36.023564+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:37.023821+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852629 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:38.023975+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563658211000 session 0x563658763680
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563658210800 session 0x56365876e780
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:39.024194+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:40.024426+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:41.024730+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:42.024964+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852629 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:43.025177+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:44.025372+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:45.025708+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:46.026012+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:47.026275+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852629 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:48.026583+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:49.026974+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:50.027176+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:51.027361+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:52.027580+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852629 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:53.027817+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:54.028044+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:55.028235+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 31.067621231s of 31.088729858s, submitted: 2
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657206c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:56.028461+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:57.028683+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 854141 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:58.028921+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:59.029131+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:00.029293+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:01.029464+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:02.029676+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:03.029832+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:04.030506+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:05.030680+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:06.030888+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:07.031040+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:08.031205+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:09.031393+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:10.031533+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:11.031675+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:12.031895+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:13.032040+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:14.032188+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:15.032339+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:16.032543+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:17.032667+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:18.032826+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:19.032975+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:20.033211+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:21.033387+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:22.033568+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:23.033742+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:24.033939+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:25.034105+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:26.034286+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:27.034472+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 31.795867920s of 31.820896149s, submitted: 2
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:28.034627+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72286208 unmapped: 991232 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:29.034776+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74530816 unmapped: 843776 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:30.035016+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:31.035188+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:32.035397+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:33.035561+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:34.035725+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:35.035928+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:36.036065+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:37.036233+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:38.036377+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74579968 unmapped: 1843200 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:39.036537+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74579968 unmapped: 1843200 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:40.036694+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74579968 unmapped: 1843200 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:41.036847+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74579968 unmapped: 1843200 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:42.037039+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74579968 unmapped: 1843200 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:43.037186+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:44.037355+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:45.037510+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:46.037673+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:47.037837+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:48.037990+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657206c00 session 0x56365793f860
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:49.038117+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:50.038269+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:51.038416+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74596352 unmapped: 1826816 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:52.038588+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74596352 unmapped: 1826816 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:53.038752+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74596352 unmapped: 1826816 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:54.039061+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563656719800 session 0x563658688b40
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:55.039240+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:56.039432+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:57.039560+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:58.039731+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:59.039897+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:00.040084+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:01.040259+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:02.040492+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 34.482105255s of 35.065586090s, submitted: 214
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855062 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:03.040672+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:04.040874+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:05.041081+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657e44400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:06.041245+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:07.041399+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 856574 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:08.042498+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:09.043271+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:10.043810+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 1802240 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:11.044158+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 1802240 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657e44800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:12.044469+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 1802240 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859598 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:13.045114+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 1802240 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:14.045816+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 1794048 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:15.046580+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 1794048 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:16.046947+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 1794048 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:17.047153+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 1794048 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.189227104s of 15.369211197s, submitted: 4
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:18.047698+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:19.048017+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:20.048215+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:21.048574+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:22.048881+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:23.049246+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:24.049534+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:25.049718+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:26.049961+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:27.050231+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:28.050513+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:29.050739+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:30.050976+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:31.051271+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:32.051557+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:33.051811+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:34.052020+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:35.052225+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:36.052446+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:37.052662+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:38.052823+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:39.053108+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:40.053361+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:41.053663+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:42.053949+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:43.054121+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:44.054353+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:45.054641+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:46.054914+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:47.055173+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:48.055385+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:49.055670+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:50.055893+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:51.056200+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:52.056512+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:53.056705+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:54.056955+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:55.057185+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:56.057350+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44400 session 0x56365874e960
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:57.057510+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:58.057732+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:59.057989+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:00.058146+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44800 session 0x5636587c41e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:01.058390+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:02.058636+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:03.058819+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:04.059000+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:05.059166+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:06.059378+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:07.059607+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:08.059819+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:09.060029+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:10.060200+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:11.060410+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:12.060691+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:13.060854+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657e44400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 55.563137054s of 55.571445465s, submitted: 2
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:14.061105+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:15.061402+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:16.061650+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:17.061865+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:18.062058+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859928 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:19.062280+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:20.062487+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:21.062709+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:22.062933+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:23.063178+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 860849 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:24.063378+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:25.063571+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.061788559s of 12.072693825s, submitted: 3
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:26.063828+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:27.064005+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:28.064289+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 860258 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:29.064517+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:30.064747+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:31.064950+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:32.065214+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:33.065378+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 860258 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:34.065667+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:35.065942+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:36.066200+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:37.066457+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:38.066764+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 860258 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:39.067159+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:40.067726+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:41.068177+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:42.068752+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:43.069034+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 860258 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:44.069344+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:45.069823+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:46.070116+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:47.070337+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:48.070574+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 860258 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:49.070823+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:50.071030+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 25.325607300s of 25.328844070s, submitted: 1
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:51.071254+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:52.071542+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:53.071783+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859667 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:54.072009+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:55.072210+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:56.072441+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:57.074179+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:58.074422+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859667 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:59.074677+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:00.074904+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:01.075054+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:02.075325+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:03.075559+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859667 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:04.075833+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:05.076075+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563658210000 session 0x563655b950e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:06.076356+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:07.076713+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:08.076945+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859667 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:09.077146+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:10.077377+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:11.077702+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:12.078036+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:13.078267+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859667 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:14.078496+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:15.078729+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:16.078935+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:17.079110+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:18.079250+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859667 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:19.079413+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 29.008197784s of 29.011583328s, submitted: 1
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:20.079657+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:21.079914+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:22.080246+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:23.080431+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862691 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:24.080713+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:25.080942+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:26.081215+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:27.081485+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:28.081690+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862691 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:29.081901+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:30.082169+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:31.082362+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:32.082705+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:33.082938+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862691 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:34.083176+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:35.083391+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:36.083652+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44400 session 0x5636587ca000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:37.083854+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:38.084061+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862691 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:39.084209+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:40.084382+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:41.084560+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:42.084806+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:43.085034+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862691 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:44.085210+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:45.085393+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:46.085646+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:47.085818+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:48.086055+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862691 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:49.086312+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657206c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:50.086486+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74694656 unmapped: 1728512 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 136 handle_osd_map epochs [136,137], i have 136, src has [1,137]
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 30.283664703s of 30.289636612s, submitted: 2
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:51.086667+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74735616 unmapped: 1687552 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 137 handle_osd_map epochs [137,138], i have 137, src has [1,138]
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:52.086866+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 696320 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 138 handle_osd_map epochs [138,139], i have 138, src has [1,139]
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 139 ms_handle_reset con 0x563657206c00 session 0x5636587d4000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:53.087034+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 663552 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658210800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658211000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965957 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 139 heartbeat osd_stat(store_statfs(0x4fc27b000/0x0/0x4ffc00000, data 0x8ed7bb/0x99f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:54.087279+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76922880 unmapped: 16285696 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 139 handle_osd_map epochs [139,140], i have 139, src has [1,140]
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 140 ms_handle_reset con 0x563658210800 session 0x5636587c50e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:55.087468+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76922880 unmapped: 16285696 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:56.087645+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76922880 unmapped: 16285696 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:57.087871+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76922880 unmapped: 16285696 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:58.088091+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76922880 unmapped: 16285696 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 140 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd5f919/0xe12000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969647 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:59.088279+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76922880 unmapped: 16285696 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:00.088499+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76922880 unmapped: 16285696 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 140 handle_osd_map epochs [140,141], i have 140, src has [1,141]
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.993000031s of 10.192948341s, submitted: 43
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:01.088688+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:02.088961+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe06000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:03.089203+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970213 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:04.089457+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:05.089652+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:06.089897+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe06000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:07.090152+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:08.090414+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969622 data_alloc: 218103808 data_used: 73728
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe06000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:09.090677+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76816384 unmapped: 16392192 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:10.091148+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76816384 unmapped: 16392192 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:11.091404+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76816384 unmapped: 16392192 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:12.091700+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76816384 unmapped: 16392192 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:13.091945+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76816384 unmapped: 16392192 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969855 data_alloc: 218103808 data_used: 77824
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:14.092190+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.708740234s of 14.725612640s, submitted: 13
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:15.092377+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:16.092578+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:17.092785+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:18.092949+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969264 data_alloc: 218103808 data_used: 77824
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:19.093250+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:20.093465+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:21.093742+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:22.094038+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:23.094355+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969264 data_alloc: 218103808 data_used: 77824
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:24.094553+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:25.094791+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:26.095070+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:27.095231+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:28.095451+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969264 data_alloc: 218103808 data_used: 77824
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:29.095742+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:30.095997+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:31.096272+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:32.096555+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658210800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 141 ms_handle_reset con 0x563658210800 session 0x5636587d4f00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657206c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 141 ms_handle_reset con 0x563657206c00 session 0x5636587d50e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657e44400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 141 ms_handle_reset con 0x563657e44400 session 0x5636587d54a0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:33.096770+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76832768 unmapped: 16375808 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969264 data_alloc: 218103808 data_used: 77824
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657e44800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.942977905s of 18.945615768s, submitted: 1
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 141 ms_handle_reset con 0x563657e44800 session 0x5636587d5860
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658210000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:34.096933+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76840960 unmapped: 16367616 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 141 ms_handle_reset con 0x563658210000 session 0x563655b950e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:35.097097+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76840960 unmapped: 16367616 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657206c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 141 ms_handle_reset con 0x563657206c00 session 0x5636572210e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657e44800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:36.097300+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 141 handle_osd_map epochs [141,142], i have 141, src has [1,142]
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76840960 unmapped: 16367616 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658210800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:37.097520+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 142 handle_osd_map epochs [142,143], i have 142, src has [1,143]
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76840960 unmapped: 16367616 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563657e44800 session 0x563657ed6f00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658211400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563658211400 session 0x56365876e1e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658211800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:38.097707+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563658211800 session 0x56365874f680
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658211c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78241792 unmapped: 14966784 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563658211c00 session 0x56365874f0e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658211c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563658211c00 session 0x56365861c960
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1025258 data_alloc: 218103808 data_used: 81920
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:39.097945+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78241792 unmapped: 14966784 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:40.098182+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78241792 unmapped: 14966784 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:41.098436+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78258176 unmapped: 14950400 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657206c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563657206c00 session 0x5636587ca3c0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fb9f0000/0x0/0x4ffc00000, data 0x1174c29/0x122b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:42.098712+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78258176 unmapped: 14950400 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:43.098974+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657e44800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563657e44800 session 0x5636587ca000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78258176 unmapped: 14950400 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1024667 data_alloc: 218103808 data_used: 81920
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:44.099200+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658211800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563658211800 session 0x5636587c41e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658211400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.451130867s of 10.229439735s, submitted: 42
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 79314944 unmapped: 13893632 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563658211400 session 0x5636586892c0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:45.099447+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78856192 unmapped: 14352384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657206c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:46.099672+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78856192 unmapped: 14352384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:47.099946+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fb9ec000/0x0/0x4ffc00000, data 0x1176c74/0x122f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 80183296 unmapped: 13025280 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:48.100117+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fb9ec000/0x0/0x4ffc00000, data 0x1176c74/0x122f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1047142 data_alloc: 218103808 data_used: 2912256
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:49.100357+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:50.100556+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:51.100805+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:52.101053+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:53.101242+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fb9ec000/0x0/0x4ffc00000, data 0x1176c74/0x122f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1047142 data_alloc: 218103808 data_used: 2912256
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:54.101383+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:55.101511+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fb9ec000/0x0/0x4ffc00000, data 0x1176c74/0x122f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:56.101660+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:57.101822+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.154132843s of 13.616702080s, submitted: 20
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:58.101976+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 88580096 unmapped: 4628480 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1142440 data_alloc: 218103808 data_used: 3985408
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x5636587cb4a0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:59.102156+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 92405760 unmapped: 1851392 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:00.102344+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9d5d000/0x0/0x4ffc00000, data 0x1c58c74/0x1d11000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,1])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90374144 unmapped: 3883008 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:01.102492+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90374144 unmapped: 3883008 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:02.102712+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90382336 unmapped: 3874816 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:03.102886+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90382336 unmapped: 3874816 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161392 data_alloc: 218103808 data_used: 4464640
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:04.103044+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cb1000/0x0/0x4ffc00000, data 0x1d12c74/0x1dcb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90415104 unmapped: 3842048 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:05.103159+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cb1000/0x0/0x4ffc00000, data 0x1d12c74/0x1dcb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90423296 unmapped: 3833856 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:06.103325+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90431488 unmapped: 3825664 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:07.103501+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90431488 unmapped: 3825664 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:08.103642+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90431488 unmapped: 3825664 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161544 data_alloc: 218103808 data_used: 4534272
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:09.103825+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90447872 unmapped: 3809280 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:10.103989+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90447872 unmapped: 3809280 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.395915031s of 12.716011047s, submitted: 144
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c8d000/0x0/0x4ffc00000, data 0x1d36c74/0x1def000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:11.104154+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90652672 unmapped: 3604480 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:12.104326+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90652672 unmapped: 3604480 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:13.104472+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90652672 unmapped: 3604480 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c82000/0x0/0x4ffc00000, data 0x1d41c74/0x1dfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1164216 data_alloc: 218103808 data_used: 4538368
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:14.104677+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90652672 unmapped: 3604480 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c82000/0x0/0x4ffc00000, data 0x1d41c74/0x1dfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:15.104818+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90652672 unmapped: 3604480 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:16.105040+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90652672 unmapped: 3604480 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636587f6800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c82000/0x0/0x4ffc00000, data 0x1d41c74/0x1dfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:17.105183+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90652672 unmapped: 3604480 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:18.105392+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90660864 unmapped: 3596288 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1166420 data_alloc: 218103808 data_used: 4538368
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:19.105547+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90669056 unmapped: 4636672 heap: 95305728 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658211000 session 0x5636587cc960
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:20.105676+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90669056 unmapped: 4636672 heap: 95305728 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c7f000/0x0/0x4ffc00000, data 0x1d44c74/0x1dfd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:21.106016+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90669056 unmapped: 4636672 heap: 95305728 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:22.106310+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90669056 unmapped: 4636672 heap: 95305728 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:23.106460+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90669056 unmapped: 4636672 heap: 95305728 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1165829 data_alloc: 218103808 data_used: 4538368
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:24.106695+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90669056 unmapped: 4636672 heap: 95305728 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636587f6400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.344936371s of 14.540517807s, submitted: 9
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6400 session 0x563657b5af00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:25.106867+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5fc00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x5636577f14a0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90554368 unmapped: 4751360 heap: 95305728 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c7f000/0x0/0x4ffc00000, data 0x1d44c74/0x1dfd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:26.107662+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636587623c0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91594752 unmapped: 19603456 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636587cd0e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5fc00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x56365579b680
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x563655c52f00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658211000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658211000 session 0x563657221c20
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:27.108353+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91619328 unmapped: 19578880 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:28.109081+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91619328 unmapped: 19578880 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1266778 data_alloc: 218103808 data_used: 4538368
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:29.109252+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91619328 unmapped: 19578880 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:30.109572+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91619328 unmapped: 19578880 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:31.109919+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91627520 unmapped: 19570688 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f13000/0x0/0x4ffc00000, data 0x2aafc74/0x2b68000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:32.110277+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91627520 unmapped: 19570688 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636587f6400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6400 session 0x563657911c20
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:33.110523+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91652096 unmapped: 19546112 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5fc00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:34.110736+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271440 data_alloc: 218103808 data_used: 4542464
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91471872 unmapped: 19726336 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:35.110861+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f13000/0x0/0x4ffc00000, data 0x2aafc97/0x2b69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91471872 unmapped: 19726336 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:36.111166+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91471872 unmapped: 19726336 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:37.111414+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91471872 unmapped: 19726336 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:38.111547+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100147200 unmapped: 11051008 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:39.111714+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1366136 data_alloc: 234881024 data_used: 18526208
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104595456 unmapped: 6602752 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f13000/0x0/0x4ffc00000, data 0x2aafc97/0x2b69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.594265938s of 14.785900116s, submitted: 47
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:40.111970+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f13000/0x0/0x4ffc00000, data 0x2aafc97/0x2b69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104628224 unmapped: 6569984 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:41.112286+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f13000/0x0/0x4ffc00000, data 0x2aafc97/0x2b69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104628224 unmapped: 6569984 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:42.112584+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104628224 unmapped: 6569984 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:43.112762+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104628224 unmapped: 6569984 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:44.113065+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1364363 data_alloc: 234881024 data_used: 18526208
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104628224 unmapped: 6569984 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:45.113249+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f13000/0x0/0x4ffc00000, data 0x2aafc97/0x2b69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104693760 unmapped: 6504448 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:46.113426+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104693760 unmapped: 6504448 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:47.113680+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104693760 unmapped: 6504448 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:48.113869+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104693760 unmapped: 6504448 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:49.114060+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1393335 data_alloc: 234881024 data_used: 18522112
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 5136384 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:50.114267+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.925302505s of 10.360248566s, submitted: 103
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f82aa000/0x0/0x4ffc00000, data 0x3718c97/0x37d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112877568 unmapped: 2727936 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:51.114493+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113098752 unmapped: 2506752 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:52.114810+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 4464640 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:53.114972+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7f87000/0x0/0x4ffc00000, data 0x3a2dc97/0x3ae7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7f92000/0x0/0x4ffc00000, data 0x3a30c97/0x3aea000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 4464640 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:54.115136+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1486231 data_alloc: 234881024 data_used: 19931136
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 4456448 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:55.115330+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 4456448 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:56.115671+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 4456448 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:57.116212+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111157248 unmapped: 4448256 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:58.116397+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111157248 unmapped: 4448256 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7f8d000/0x0/0x4ffc00000, data 0x3a35c97/0x3aef000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563658762b40
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x5636587c4d20
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:59.117160+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1486275 data_alloc: 234881024 data_used: 19931136
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658211000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101130240 unmapped: 14475264 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:00.117319+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.153506279s of 10.047043800s, submitted: 85
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658211000 session 0x5636578ac780
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100425728 unmapped: 15179776 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:01.117673+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100237312 unmapped: 15368192 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:02.118009+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100237312 unmapped: 15368192 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c75000/0x0/0x4ffc00000, data 0x1d4dc74/0x1e06000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:03.118304+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100311040 unmapped: 15294464 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:04.118755+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182611 data_alloc: 218103808 data_used: 4526080
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100311040 unmapped: 15294464 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:05.119149+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100311040 unmapped: 15294464 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:06.119314+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100311040 unmapped: 15294464 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:07.119642+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c73000/0x0/0x4ffc00000, data 0x1d50c74/0x1e09000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100311040 unmapped: 15294464 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:08.119934+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657206c00 session 0x563656b24000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100311040 unmapped: 15294464 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636587f6400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:09.120179+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1018203 data_alloc: 218103808 data_used: 90112
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6400 session 0x563656312d20
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:10.120449+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:11.120709+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:12.120996+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:13.121242+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:14.121484+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1017647 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:15.121728+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:16.121947+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:17.122165+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:18.122323+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:19.122719+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1017647 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:20.122937+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:21.123110+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:22.123321+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:23.123574+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:24.123780+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1017647 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:25.124008+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:26.124187+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:27.124400+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:28.124695+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:29.124978+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1017647 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:30.125150+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:31.125346+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655bc50e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5fc00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x563655bc4780
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657206c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657206c00 session 0x563655bc5860
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658211000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658211000 session 0x56365876f0e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636587f6400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 31.414421082s of 31.503835678s, submitted: 50
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 105201664 unmapped: 23019520 heap: 128221184 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6400 session 0x56365876ef00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:32.125473+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563657ed74a0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ee2000/0x0/0x4ffc00000, data 0x1ae2ca3/0x1b9a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:33.125583+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x563655b9b4a0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ee2000/0x0/0x4ffc00000, data 0x1ae2ca3/0x1b9a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:34.125794+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1122684 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:35.125929+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:36.126070+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:37.126237+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:38.126421+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ee2000/0x0/0x4ffc00000, data 0x1ae2ca3/0x1b9a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:39.126553+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1122684 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:40.126708+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5fc00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x56365579a3c0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 96010240 unmapped: 35889152 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:41.126900+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657206c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 96010240 unmapped: 35889152 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:42.127091+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 96018432 unmapped: 35880960 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:43.127322+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 98639872 unmapped: 33259520 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:44.127519+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1209473 data_alloc: 234881024 data_used: 12808192
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ee1000/0x0/0x4ffc00000, data 0x1ae2cc6/0x1b9b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 29384704 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:45.127671+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 29384704 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:46.127847+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 29384704 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:47.127959+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 29384704 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:48.128129+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 29384704 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:49.128240+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1209473 data_alloc: 234881024 data_used: 12808192
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 29384704 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:50.128402+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ee1000/0x0/0x4ffc00000, data 0x1ae2cc6/0x1b9b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102531072 unmapped: 29368320 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:51.128541+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102531072 unmapped: 29368320 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:52.128792+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6800 session 0x563657973c20
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658210800 session 0x5636579732c0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102531072 unmapped: 29368320 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:53.128961+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ee1000/0x0/0x4ffc00000, data 0x1ae2cc6/0x1b9b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102580224 unmapped: 29319168 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 21.930128098s of 22.232917786s, submitted: 48
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:54.129122+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ee1000/0x0/0x4ffc00000, data 0x1ae2cc6/0x1b9b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,3])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308349 data_alloc: 234881024 data_used: 13668352
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110280704 unmapped: 21618688 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:55.129337+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111304704 unmapped: 20594688 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:56.129545+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111304704 unmapped: 20594688 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:57.129750+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9134000/0x0/0x4ffc00000, data 0x288ecc6/0x2947000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111304704 unmapped: 20594688 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:58.129927+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111312896 unmapped: 20586496 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:59.130073+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9134000/0x0/0x4ffc00000, data 0x288ecc6/0x2947000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1318541 data_alloc: 234881024 data_used: 13930496
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111345664 unmapped: 20553728 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:00.130234+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111493120 unmapped: 20406272 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:01.130378+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:02.130962+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:03.131170+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:04.131816+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1316725 data_alloc: 234881024 data_used: 13930496
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:05.132372+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9114000/0x0/0x4ffc00000, data 0x28afcc6/0x2968000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:06.132530+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:07.132710+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.554829597s of 13.991487503s, submitted: 126
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:08.132983+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:09.133298+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f910e000/0x0/0x4ffc00000, data 0x28b5cc6/0x296e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1316973 data_alloc: 234881024 data_used: 13930496
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658210800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:10.133479+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:11.133791+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:12.134076+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:13.134349+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:14.134552+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1319549 data_alloc: 234881024 data_used: 13942784
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9103000/0x0/0x4ffc00000, data 0x28c0cc6/0x2979000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:15.134741+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:16.134976+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:17.135177+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110174208 unmapped: 21725184 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.098366737s of 10.114171028s, submitted: 5
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:18.135370+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657206c00 session 0x5636585872c0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655c534a0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:19.135569+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1035157 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:20.135785+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa9a8000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:21.135960+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:22.136183+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:23.136422+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:24.136633+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1035157 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:25.136956+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:26.137133+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa9a8000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:27.137439+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa9a8000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:28.137687+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:29.137927+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1035157 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:30.138129+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa9a8000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:31.138400+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:32.138672+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:33.138918+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:34.139090+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1035157 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:35.139272+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa9a8000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:36.139451+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:37.139579+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:38.139750+0000)
Dec 01 10:28:04 compute-2 ceph-mon[76053]: from='client.16461 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:39.139916+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-mon[76053]: from='client.26240 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1035157 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa9a8000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-mon[76053]: from='client.25723 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:40.140089+0000)
Dec 01 10:28:04 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/4008813133' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3691430398' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:41.140278+0000)
Dec 01 10:28:04 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3518594572' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2753251066' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:42.140482+0000)
Dec 01 10:28:04 compute-2 ceph-mon[76053]: from='client.26261 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-mon[76053]: from='client.25738 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:43.140573+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2398960754' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5fc00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x56365579b0e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:28:04 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1731495337' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x5636578fc1e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636587f6800
Dec 01 10:28:04 compute-2 ceph-mon[76053]: pgmap v1125: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6800 session 0x56365861da40
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636587f6800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/4130838568' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:44.140784+0000)
Dec 01 10:28:04 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/216293876' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6800 session 0x563657ed7680
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101466112 unmapped: 30433280 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-mon[76053]: from='client.26279 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 26.040988922s of 26.121164322s, submitted: 35
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563657ed7c20
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5fc00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x5636585863c0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ebf000/0x0/0x4ffc00000, data 0x16f6c41/0x17ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x56365876ef00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657206c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1111180 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657206c00 session 0x563655bc4b40
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563656313680
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:45.140968+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101269504 unmapped: 34308096 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:46.141150+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101269504 unmapped: 34308096 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:47.141336+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101269504 unmapped: 34308096 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ebf000/0x0/0x4ffc00000, data 0x16f6c41/0x17ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:48.141515+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101269504 unmapped: 34308096 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101269504 unmapped: 34308096 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:49.254025+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1110956 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:50.254168+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101269504 unmapped: 34308096 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ebf000/0x0/0x4ffc00000, data 0x16f6c41/0x17ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5fc00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x563655bc5860
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:51.254309+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 99614720 unmapped: 35962880 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636587f6800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:52.254659+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 99614720 unmapped: 35962880 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9e9a000/0x0/0x4ffc00000, data 0x171ac64/0x17d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:53.254792+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101974016 unmapped: 33603584 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:54.254943+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182534 data_alloc: 234881024 data_used: 9736192
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:55.255079+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:56.255243+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:57.255407+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:58.256135+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9e9a000/0x0/0x4ffc00000, data 0x171ac64/0x17d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:59.256330+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182534 data_alloc: 234881024 data_used: 9736192
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:00.256538+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:01.256722+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:02.256900+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9e9a000/0x0/0x4ffc00000, data 0x171ac64/0x17d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.628173828s of 18.831754684s, submitted: 39
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:03.257096+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 24641536 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9141000/0x0/0x4ffc00000, data 0x246bc64/0x2523000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,1])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:04.257296+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112017408 unmapped: 23560192 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1312254 data_alloc: 234881024 data_used: 11382784
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:05.257496+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110411776 unmapped: 25165824 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:06.258068+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110411776 unmapped: 25165824 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:07.258242+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110411776 unmapped: 25165824 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f8c000/0x0/0x4ffc00000, data 0x2628c64/0x26e0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:08.258376+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110411776 unmapped: 25165824 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:09.258980+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110419968 unmapped: 25157632 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1319394 data_alloc: 234881024 data_used: 11612160
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:10.259303+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110419968 unmapped: 25157632 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:11.259786+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110551040 unmapped: 25026560 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:12.260391+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110551040 unmapped: 25026560 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f6b000/0x0/0x4ffc00000, data 0x2649c64/0x2701000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:13.260798+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110551040 unmapped: 25026560 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:14.260999+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110567424 unmapped: 25010176 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1321018 data_alloc: 234881024 data_used: 11685888
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:15.261393+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110567424 unmapped: 25010176 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:16.261672+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110567424 unmapped: 25010176 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f6b000/0x0/0x4ffc00000, data 0x2649c64/0x2701000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.655291557s of 14.004703522s, submitted: 146
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f6b000/0x0/0x4ffc00000, data 0x2649c64/0x2701000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:17.262094+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110682112 unmapped: 24895488 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:18.262439+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 24887296 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:19.262720+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 24887296 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1321058 data_alloc: 234881024 data_used: 11685888
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f5b000/0x0/0x4ffc00000, data 0x2659c64/0x2711000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:20.262972+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 24887296 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:21.263247+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 24887296 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:22.263515+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 24887296 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:23.263762+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 24887296 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f5b000/0x0/0x4ffc00000, data 0x2659c64/0x2711000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:24.263917+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110706688 unmapped: 24870912 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1321202 data_alloc: 234881024 data_used: 11685888
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657f58c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58c00 session 0x56365793ef00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657207000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657207000 session 0x56365793eb40
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:25.264132+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98000 session 0x56365793f4a0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636587cbc20
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110714880 unmapped: 24862720 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5fc00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x5636587caf00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657207000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657207000 session 0x5636587cab40
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:26.264287+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111935488 unmapped: 23642112 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:27.264528+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111935488 unmapped: 23642112 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:28.264778+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111935488 unmapped: 23642112 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:29.265004+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f85c2000/0x0/0x4ffc00000, data 0x2ff1cc6/0x30aa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111968256 unmapped: 23609344 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1398065 data_alloc: 234881024 data_used: 11685888
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:30.265202+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657f58c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58c00 session 0x5636587ca3c0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 23601152 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:31.265368+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587ca000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 23601152 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:32.265657+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636572454a0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.045106888s of 15.331792831s, submitted: 38
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 23601152 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563657244780
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:33.265832+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111132672 unmapped: 24444928 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f859d000/0x0/0x4ffc00000, data 0x3015cd6/0x30cf000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5fc00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:34.266035+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 24436736 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1400939 data_alloc: 234881024 data_used: 11685888
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:35.266749+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 24436736 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:36.267194+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111165440 unmapped: 24412160 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657207000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:37.293184+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117923840 unmapped: 17653760 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f859b000/0x0/0x4ffc00000, data 0x3016cd6/0x30d0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:38.293327+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 16211968 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:39.299528+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 16211968 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1469875 data_alloc: 234881024 data_used: 20869120
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:40.299725+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 16211968 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:41.299880+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 16211968 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:42.300092+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 16211968 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:43.300271+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 16211968 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f859b000/0x0/0x4ffc00000, data 0x3016cd6/0x30d0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:44.300545+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 16179200 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1469875 data_alloc: 234881024 data_used: 20869120
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:45.300717+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 16171008 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:46.300908+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 16171008 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:47.301090+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.843387604s of 15.061408043s, submitted: 10
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121667584 unmapped: 13910016 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7a78000/0x0/0x4ffc00000, data 0x3b34cd6/0x3bee000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,1])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:48.301307+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122765312 unmapped: 12812288 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:49.301453+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7a78000/0x0/0x4ffc00000, data 0x3b34cd6/0x3bee000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 124649472 unmapped: 10928128 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1578891 data_alloc: 234881024 data_used: 21770240
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:50.301668+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7a57000/0x0/0x4ffc00000, data 0x3b4dcd6/0x3c07000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [1])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 10395648 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:51.301821+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 10379264 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:52.302022+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 10379264 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:53.302762+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7a57000/0x0/0x4ffc00000, data 0x3b4dcd6/0x3c07000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 10379264 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:54.302901+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 10379264 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1580563 data_alloc: 234881024 data_used: 21909504
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:55.303045+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 10379264 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657207000 session 0x56365678f860
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x5636586f05a0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657f58c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:56.303179+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 10379264 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7a57000/0x0/0x4ffc00000, data 0x3b4dcd6/0x3c07000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,1])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:57.303310+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58c00 session 0x5636579730e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 17629184 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:58.303449+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f56000/0x0/0x4ffc00000, data 0x265dc64/0x2715000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 17629184 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:59.303622+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 17629184 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f56000/0x0/0x4ffc00000, data 0x265dc64/0x2715000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1336708 data_alloc: 234881024 data_used: 10829824
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:00.303771+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 17629184 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:01.303927+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 17629184 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f56000/0x0/0x4ffc00000, data 0x265dc64/0x2715000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.856204987s of 14.509012222s, submitted: 130
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x5636563130e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6800 session 0x5636577f0f00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:02.304099+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110075904 unmapped: 25501696 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x563656312d20
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f57000/0x0/0x4ffc00000, data 0x265dc64/0x2715000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:03.304239+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:04.304405+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1068678 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:05.304503+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:06.304664+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:07.304840+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:08.304998+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:09.305157+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1068678 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:10.305335+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:11.305514+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:12.305735+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:13.305884+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:14.305983+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1068678 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:15.306172+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:16.306404+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:17.306550+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:18.306682+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:19.306819+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1068678 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:20.306957+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:21.307108+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:22.307316+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:23.307453+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:24.307635+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1068678 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:25.307823+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:26.307957+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:27.308095+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.2 total, 600.0 interval
                                           Cumulative writes: 8246 writes, 33K keys, 8246 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s
                                           Cumulative WAL: 8246 writes, 1954 syncs, 4.22 writes per sync, written: 0.03 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 2348 writes, 8901 keys, 2348 commit groups, 1.0 writes per commit group, ingest: 10.48 MB, 0.02 MB/s
                                           Interval WAL: 2348 writes, 926 syncs, 2.54 writes per sync, written: 0.01 GB, 0.02 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636587d4b40
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5fc00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x5636587d5680
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587d43c0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655c22f00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 25.636785507s of 26.091878891s, submitted: 58
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x563655c225a0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636587f6800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:28.308230+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6800 session 0x563656b24000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657207000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657207000 session 0x563656b24b40
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657207000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657207000 session 0x563656b24780
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x563655c310e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110108672 unmapped: 33341440 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:29.308396+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110108672 unmapped: 33341440 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123514 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:30.308555+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110108672 unmapped: 33341440 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x56365874e780
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:31.308674+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x56365874e000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110108672 unmapped: 33341440 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636587f6800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6800 session 0x56365874f2c0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:32.308887+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110108672 unmapped: 33341440 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x56365874f0e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:33.309035+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110108672 unmapped: 33341440 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa081000/0x0/0x4ffc00000, data 0x1531c84/0x15eb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:34.309175+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110108672 unmapped: 33341440 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1155373 data_alloc: 218103808 data_used: 4284416
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:35.309309+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:36.309510+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa081000/0x0/0x4ffc00000, data 0x1531c84/0x15eb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:37.309711+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:38.309885+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:39.310020+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182125 data_alloc: 218103808 data_used: 8257536
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:40.310182+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:41.310409+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa081000/0x0/0x4ffc00000, data 0x1531c84/0x15eb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:42.310666+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa081000/0x0/0x4ffc00000, data 0x1531c84/0x15eb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:43.310811+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:44.311918+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa081000/0x0/0x4ffc00000, data 0x1531c84/0x15eb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182125 data_alloc: 218103808 data_used: 8257536
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:45.312099+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.438076019s of 17.504392624s, submitted: 12
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113065984 unmapped: 30384128 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:46.312985+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113065984 unmapped: 30384128 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:47.313151+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113205248 unmapped: 30244864 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:48.313794+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97fe000/0x0/0x4ffc00000, data 0x1db4c84/0x1e6e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:49.314518+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1256563 data_alloc: 218103808 data_used: 8261632
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:50.315134+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:51.315720+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:52.316260+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97ee000/0x0/0x4ffc00000, data 0x1dc4c84/0x1e7e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:53.316702+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:54.317099+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1255771 data_alloc: 218103808 data_used: 8261632
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:55.317499+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97eb000/0x0/0x4ffc00000, data 0x1dc7c84/0x1e81000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:56.317881+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97eb000/0x0/0x4ffc00000, data 0x1dc7c84/0x1e81000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:57.318139+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:58.318298+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:59.318451+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1255771 data_alloc: 218103808 data_used: 8261632
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:00.318708+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97eb000/0x0/0x4ffc00000, data 0x1dc7c84/0x1e81000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:01.319047+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97eb000/0x0/0x4ffc00000, data 0x1dc7c84/0x1e81000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:02.319279+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:03.319440+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:04.319664+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1255771 data_alloc: 218103808 data_used: 8261632
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:05.319842+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.239578247s of 20.645868301s, submitted: 49
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:06.320101+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 30154752 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:07.320263+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97e9000/0x0/0x4ffc00000, data 0x1dc9c84/0x1e83000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 30154752 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:08.320501+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 30154752 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:09.320665+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97e9000/0x0/0x4ffc00000, data 0x1dc9c84/0x1e83000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636577d9c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9c00 session 0x563657ed61e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636577d9800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9800 session 0x563657ed6780
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563659b7b800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563659b7b800 session 0x563657ed65a0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563659b7a000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 30154752 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563659b7a000 session 0x563657ed7e00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563659b7a000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:10.320806+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1290770 data_alloc: 218103808 data_used: 8261632
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563659b7a000 session 0x563657ed6f00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x563657ed7c20
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636577d9800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9800 session 0x563655c534a0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636577d9c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9c00 session 0x563657ed6000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563659b7b000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563659b7b000 session 0x56365874fa40
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113819648 unmapped: 29630464 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97e9000/0x0/0x4ffc00000, data 0x1dc9c84/0x1e83000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:11.320948+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113819648 unmapped: 29630464 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:12.321228+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113819648 unmapped: 29630464 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:13.321394+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113819648 unmapped: 29630464 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:14.321667+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587cad20
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114163712 unmapped: 29286400 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636577d9800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:15.321911+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636577d9c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1296076 data_alloc: 218103808 data_used: 8269824
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114163712 unmapped: 29286400 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:16.322044+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f94d3000/0x0/0x4ffc00000, data 0x20deca7/0x2199000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:17.322205+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:18.322890+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.166189194s of 12.288041115s, submitted: 40
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:19.323049+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f94d1000/0x0/0x4ffc00000, data 0x20dfca7/0x219a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:20.323878+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310548 data_alloc: 234881024 data_used: 10141696
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:21.324052+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f94d1000/0x0/0x4ffc00000, data 0x20dfca7/0x219a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:22.324429+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:23.324662+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:24.324910+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f94d1000/0x0/0x4ffc00000, data 0x20dfca7/0x219a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:25.325045+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310548 data_alloc: 234881024 data_used: 10141696
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f94d1000/0x0/0x4ffc00000, data 0x20dfca7/0x219a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:26.325180+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:27.325329+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119554048 unmapped: 23896064 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:28.325524+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.762595177s of 10.001276016s, submitted: 110
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 23732224 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:29.325686+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121634816 unmapped: 21815296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a60000/0x0/0x4ffc00000, data 0x2b42ca7/0x2bfd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:30.325905+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395038 data_alloc: 234881024 data_used: 11534336
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655c17000 session 0x563657244000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658d10000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:31.326110+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:32.326411+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:33.326660+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:34.326823+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6f000/0x0/0x4ffc00000, data 0x2b42ca7/0x2bfd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:35.327003+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395038 data_alloc: 234881024 data_used: 11534336
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:36.327203+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:37.327356+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:38.327515+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6f000/0x0/0x4ffc00000, data 0x2b42ca7/0x2bfd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:39.327675+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:40.327911+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395038 data_alloc: 234881024 data_used: 11534336
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.802636147s of 12.282186508s, submitted: 208
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:41.328096+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:42.328355+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:43.328504+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:44.328767+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6e000/0x0/0x4ffc00000, data 0x2b43ca7/0x2bfe000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,1])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:45.328918+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395262 data_alloc: 234881024 data_used: 11534336
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:46.329077+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:47.329262+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:48.329414+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:49.329564+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6b000/0x0/0x4ffc00000, data 0x2b44ca7/0x2bff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:50.329732+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395270 data_alloc: 234881024 data_used: 11534336
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120053760 unmapped: 23396352 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:51.329864+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6b000/0x0/0x4ffc00000, data 0x2b44ca7/0x2bff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120053760 unmapped: 23396352 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:52.330086+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120053760 unmapped: 23396352 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:53.330290+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120053760 unmapped: 23396352 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:54.330464+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120053760 unmapped: 23396352 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:55.330621+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395270 data_alloc: 234881024 data_used: 11534336
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.778802872s of 14.837076187s, submitted: 4
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120053760 unmapped: 23396352 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:56.330776+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6b000/0x0/0x4ffc00000, data 0x2b44ca7/0x2bff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120061952 unmapped: 23388160 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:57.330950+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120061952 unmapped: 23388160 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:58.331093+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120061952 unmapped: 23388160 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:59.331219+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120061952 unmapped: 23388160 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:00.331344+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395422 data_alloc: 234881024 data_used: 11534336
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6d000/0x0/0x4ffc00000, data 0x2b44ca7/0x2bff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120061952 unmapped: 23388160 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:01.331495+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:02.331711+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:03.331863+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:04.332006+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:05.332146+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395430 data_alloc: 234881024 data_used: 11534336
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:06.332310+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6b000/0x0/0x4ffc00000, data 0x2b45ca7/0x2c00000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.249480247s of 11.264521599s, submitted: 5
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:07.332467+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:08.332667+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6b000/0x0/0x4ffc00000, data 0x2b45ca7/0x2c00000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:09.332842+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6b000/0x0/0x4ffc00000, data 0x2b45ca7/0x2c00000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:10.332976+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120078336 unmapped: 23371776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395446 data_alloc: 234881024 data_used: 11534336
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6b000/0x0/0x4ffc00000, data 0x2b45ca7/0x2c00000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:11.333131+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120078336 unmapped: 23371776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:12.333394+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120078336 unmapped: 23371776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:13.333541+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120078336 unmapped: 23371776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:14.333694+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120078336 unmapped: 23371776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6c000/0x0/0x4ffc00000, data 0x2b45ca7/0x2c00000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:15.333820+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120086528 unmapped: 23363584 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1397446 data_alloc: 234881024 data_used: 11522048
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:16.333988+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120135680 unmapped: 23314432 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.100803375s of 10.120236397s, submitted: 16
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9800 session 0x5636587c5680
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9c00 session 0x5636587c4d20
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:17.334178+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120135680 unmapped: 23314432 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563659b7bc00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563659b7bc00 session 0x5636587cc1e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:18.334377+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:19.334561+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9797000/0x0/0x4ffc00000, data 0x1dcfc84/0x1e89000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:20.334914+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271742 data_alloc: 218103808 data_used: 8261632
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:21.335096+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:22.335290+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:23.335536+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:24.335701+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:25.335876+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271742 data_alloc: 218103808 data_used: 8261632
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9797000/0x0/0x4ffc00000, data 0x1dcfc84/0x1e89000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:26.336074+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:27.336239+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:28.336382+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563657911860
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x5636587c5c20
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.864808083s of 12.033938408s, submitted: 54
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:29.336572+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9797000/0x0/0x4ffc00000, data 0x1dcfc74/0x1e88000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111427584 unmapped: 32022528 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9797000/0x0/0x4ffc00000, data 0x1dcfc74/0x1e88000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:30.336750+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x56365678f860
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1095741 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:31.336918+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:32.337167+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:33.337352+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:34.337531+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:35.337744+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1095741 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:36.337958+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:37.338134+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:38.338383+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:39.338529+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:40.338755+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1095741 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:41.338923+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:42.339183+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:43.339363+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:44.339627+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:45.339829+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1095741 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:46.339992+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:47.340207+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:48.340363+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:49.340577+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:50.340902+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1095741 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:51.341095+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:52.342011+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:53.342244+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:54.342416+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:55.342734+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1095741 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:56.343375+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:57.343643+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587cba40
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636577d9800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9800 session 0x5636587cbc20
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636577d9c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9c00 session 0x563655aab0e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x56365678f0e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 28.008203506s of 29.148126602s, submitted: 24
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:58.343834+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110739456 unmapped: 32710656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655b9b2c0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x563657ed7860
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636577d9800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9800 session 0x5636586f0780
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563659b7bc00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563659b7bc00 session 0x563655c31e00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636577f03c0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:59.344058+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110698496 unmapped: 32751616 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cc4000/0x0/0x4ffc00000, data 0x18f0c51/0x19a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:00.344479+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110698496 unmapped: 32751616 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1188936 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:01.344676+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110698496 unmapped: 32751616 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:02.344839+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110673920 unmapped: 32776192 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636563121e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:03.345027+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cc3000/0x0/0x4ffc00000, data 0x18f0c74/0x19a9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110174208 unmapped: 33275904 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:04.345248+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110174208 unmapped: 33275904 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:05.345373+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110911488 unmapped: 32538624 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245921 data_alloc: 218103808 data_used: 8433664
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cc3000/0x0/0x4ffc00000, data 0x18f0c74/0x19a9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:06.345560+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 27623424 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:07.345802+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cc3000/0x0/0x4ffc00000, data 0x18f0c74/0x19a9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 27623424 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:08.346196+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cc3000/0x0/0x4ffc00000, data 0x18f0c74/0x19a9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 27623424 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:09.346418+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 27623424 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cc3000/0x0/0x4ffc00000, data 0x18f0c74/0x19a9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:10.346792+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: mgrc ms_handle_reset ms_handle_reset con 0x563655c16000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/1444264366
Dec 01 10:28:04 compute-2 ceph-osd[78644]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/1444264366,v1:192.168.122.100:6801/1444264366]
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: get_auth_request con 0x563659b7bc00 auth_method 0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: mgrc handle_mgr_configure stats_period=5
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.887072563s of 12.284139633s, submitted: 32
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x5636586f1680
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115965952 unmapped: 27484160 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563659b7b800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1267991 data_alloc: 234881024 data_used: 11882496
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563659b7b800 session 0x563657ed6000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:11.347013+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c64/0xe1f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:12.347326+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:13.347567+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:14.347805+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:15.347954+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1106325 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:16.348102+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:17.348299+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:18.348445+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:19.348638+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:20.348831+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1106325 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:21.349055+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:22.349291+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:23.349513+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658e52000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.862092018s of 12.971082687s, submitted: 35
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,6])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111779840 unmapped: 31670272 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658e52000 session 0x563656b24780
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658e52000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658e52000 session 0x5636577f05a0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x56365861cd20
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x56365876fe00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x56365874eb40
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:24.349736+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111689728 unmapped: 31760384 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:25.349875+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111689728 unmapped: 31760384 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1160474 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657e44800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:26.350036+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657e44800 session 0x56365678e780
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111689728 unmapped: 31760384 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587c4000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:27.350223+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636587c4960
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111697920 unmapped: 31752192 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x563656b661e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa1e4000/0x0/0x4ffc00000, data 0x13cfc74/0x1488000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:28.350423+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111706112 unmapped: 31744000 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:29.350576+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111706112 unmapped: 31744000 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658e52000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657e45400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:30.350738+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111738880 unmapped: 31711232 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1165862 data_alloc: 218103808 data_used: 208896
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:31.350867+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111542272 unmapped: 31907840 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:32.351070+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa1e4000/0x0/0x4ffc00000, data 0x13cfc74/0x1488000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111542272 unmapped: 31907840 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:33.351511+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111542272 unmapped: 31907840 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:34.351638+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.460638046s of 11.302368164s, submitted: 37
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111542272 unmapped: 31907840 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658e52000 session 0x56365579a3c0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657e45400 session 0x5636579730e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:35.351758+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111558656 unmapped: 31891456 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa1e4000/0x0/0x4ffc00000, data 0x13cfc74/0x1488000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,1])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1117454 data_alloc: 218103808 data_used: 94208
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:36.351971+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111345664 unmapped: 32104448 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84c000/0x0/0x4ffc00000, data 0xd67c74/0xe20000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,1])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:37.352161+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111362048 unmapped: 32088064 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:38.352310+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111370240 unmapped: 32079872 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:39.352453+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x563655c31e00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:40.352686+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114853 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:41.352851+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:42.353031+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:43.353208+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:44.353354+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:45.353576+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114853 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:46.353795+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:47.353947+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:48.354167+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:49.354380+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:50.417670+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114853 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:51.417875+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:52.418086+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:53.418315+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:54.418469+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:55.418607+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114853 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:56.418772+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:57.418956+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:58.419096+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:59.419384+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:00.419507+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114853 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:01.419740+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:02.419936+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:03.420126+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:04.420301+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:05.420450+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114853 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:06.420612+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655c521e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x563655aabc20
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658e52000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658e52000 session 0x5636572450e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658241800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658241800 session 0x5636586f14a0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:07.420764+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658241800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 28.799690247s of 32.735015869s, submitted: 47
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 32309248 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658241800 session 0x56365579a000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:08.420914+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636586f10e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 32309248 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:09.421125+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 32309248 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:10.421256+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa40d000/0x0/0x4ffc00000, data 0x11a7ca3/0x125f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 32309248 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1154538 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636587d5e00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:11.421411+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657f58400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58400 session 0x563655bc4780
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 32309248 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657f59c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f59c00 session 0x563657972f00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:12.421582+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 32301056 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587ca3c0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:13.421826+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111157248 unmapped: 32292864 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:14.421966+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657f58400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111181824 unmapped: 32268288 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:15.422088+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1184975 data_alloc: 218103808 data_used: 4452352
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa40d000/0x0/0x4ffc00000, data 0x11a7ca3/0x125f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:16.422248+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:17.422490+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:18.422655+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa40d000/0x0/0x4ffc00000, data 0x11a7ca3/0x125f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:19.422868+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:20.422993+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa40d000/0x0/0x4ffc00000, data 0x11a7ca3/0x125f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1184975 data_alloc: 218103808 data_used: 4452352
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa40d000/0x0/0x4ffc00000, data 0x11a7ca3/0x125f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:21.423167+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:22.423333+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:23.423490+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:24.423680+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:25.424178+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.764535904s of 18.126991272s, submitted: 35
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 25804800 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1281861 data_alloc: 218103808 data_used: 5279744
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:26.424352+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 118022144 unmapped: 25427968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9825000/0x0/0x4ffc00000, data 0x1d81ca3/0x1e39000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:27.424512+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117243904 unmapped: 26206208 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:28.424669+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f981d000/0x0/0x4ffc00000, data 0x1d97ca3/0x1e4f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117243904 unmapped: 26206208 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f981d000/0x0/0x4ffc00000, data 0x1d97ca3/0x1e4f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:29.424841+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117243904 unmapped: 26206208 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:30.424991+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117243904 unmapped: 26206208 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f981d000/0x0/0x4ffc00000, data 0x1d97ca3/0x1e4f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1285985 data_alloc: 218103808 data_used: 5517312
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:31.425121+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117243904 unmapped: 26206208 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:32.425276+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117243904 unmapped: 26206208 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:33.425447+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117252096 unmapped: 26198016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:34.425626+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117252096 unmapped: 26198016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:35.425850+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f981d000/0x0/0x4ffc00000, data 0x1d97ca3/0x1e4f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117252096 unmapped: 26198016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1286001 data_alloc: 218103808 data_used: 5517312
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:36.425994+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117252096 unmapped: 26198016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:37.426152+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117252096 unmapped: 26198016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:38.426287+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117260288 unmapped: 26189824 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:39.426424+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117260288 unmapped: 26189824 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.862953186s of 14.593469620s, submitted: 123
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58400 session 0x5636587c4b40
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x56365793f860
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:40.426638+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f981d000/0x0/0x4ffc00000, data 0x1d97ca3/0x1e4f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117260288 unmapped: 26189824 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658241800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1284997 data_alloc: 218103808 data_used: 5517312
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:41.426829+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113876992 unmapped: 29573120 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658241800 session 0x563657d3a3c0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:42.427037+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113893376 unmapped: 29556736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:43.427229+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113893376 unmapped: 29556736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:44.427399+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113893376 unmapped: 29556736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:45.427567+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113893376 unmapped: 29556736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1127635 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:46.427751+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113893376 unmapped: 29556736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:47.427925+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113893376 unmapped: 29556736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:48.428136+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113893376 unmapped: 29556736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:49.428288+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113901568 unmapped: 29548544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:50.428445+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113901568 unmapped: 29548544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1127635 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:51.428677+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113901568 unmapped: 29548544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:52.428927+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113901568 unmapped: 29548544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:53.429139+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113901568 unmapped: 29548544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:54.429285+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113901568 unmapped: 29548544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:55.429461+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1127635 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:56.429659+0000)
Dec 01 10:28:04 compute-2 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:57.429783+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:58.429963+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:59.430171+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:00.430336+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1127635 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:01.430491+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:02.430654+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:03.430785+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:04.430925+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:05.431074+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113942528 unmapped: 29507584 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1127635 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:06.431248+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113942528 unmapped: 29507584 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657f58800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58800 session 0x5636587d50e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587d5680
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636587d45a0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657f58400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58400 session 0x5636587d41e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:07.431361+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658241800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 26.469379425s of 27.278631210s, submitted: 38
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658241800 session 0x5636563121e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658240400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658240400 session 0x5636587d43c0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658240400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658240400 session 0x563655b94f00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587c4000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655c31c20
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114327552 unmapped: 29122560 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:08.431507+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114327552 unmapped: 29122560 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:09.431684+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114327552 unmapped: 29122560 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa374000/0x0/0x4ffc00000, data 0x123fcb3/0x12f8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:10.431812+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114327552 unmapped: 29122560 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1171981 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:11.432016+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657f58400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58400 session 0x563655aabc20
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114327552 unmapped: 29122560 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658241800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658241800 session 0x56365876e5a0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:12.432171+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114327552 unmapped: 29122560 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x56365876fe00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:13.432314+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x56365876e780
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114335744 unmapped: 29114368 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657f58400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:14.432453+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113491968 unmapped: 29958144 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:15.432644+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa374000/0x0/0x4ffc00000, data 0x123fcb3/0x12f8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:16.433042+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1203457 data_alloc: 218103808 data_used: 4964352
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:17.433283+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:18.433480+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa374000/0x0/0x4ffc00000, data 0x123fcb3/0x12f8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:19.433668+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:20.433811+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:21.434030+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1203457 data_alloc: 218103808 data_used: 4964352
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:22.434268+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:23.434437+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa374000/0x0/0x4ffc00000, data 0x123fcb3/0x12f8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:24.434619+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:25.434802+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.541067123s of 18.684326172s, submitted: 42
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114819072 unmapped: 28631040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:26.434944+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1264811 data_alloc: 218103808 data_used: 4960256
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117473280 unmapped: 25976832 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:27.435119+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119947264 unmapped: 23502848 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:28.435326+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a97000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a97000 session 0x5636587d4000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a96800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a96800 session 0x5636563125a0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a97c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a97c00 session 0x5636577f1860
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:29.435502+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a96800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a96800 session 0x5636586883c0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a97000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a97000 session 0x56365876e3c0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x563657244d20
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655aab680
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657afd400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657afd400 session 0x563657ed70e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a96800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a96800 session 0x5636587d4d20
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119300096 unmapped: 24150016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9401000/0x0/0x4ffc00000, data 0x21b0cec/0x226b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:30.435715+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119300096 unmapped: 24150016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:31.435878+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1337422 data_alloc: 218103808 data_used: 5795840
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119300096 unmapped: 24150016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:32.436129+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9401000/0x0/0x4ffc00000, data 0x21b0d25/0x226b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119332864 unmapped: 24117248 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:33.436313+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119455744 unmapped: 23994368 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9401000/0x0/0x4ffc00000, data 0x21b0d25/0x226b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a97000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a97000 session 0x56365678fc20
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:34.436726+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f93e0000/0x0/0x4ffc00000, data 0x21d1d25/0x228c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x56365793e780
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119463936 unmapped: 23986176 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:35.437451+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563656313680
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x56365631b000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x56365631b000 session 0x5636587cb680
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a96800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a97000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119480320 unmapped: 23969792 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:36.437865+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1341137 data_alloc: 218103808 data_used: 5799936
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119480320 unmapped: 23969792 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:37.438358+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f93de000/0x0/0x4ffc00000, data 0x21d1d58/0x228e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 22298624 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:38.438735+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122757120 unmapped: 20692992 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:39.439114+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.288687706s of 13.754534721s, submitted: 156
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122765312 unmapped: 20684800 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:40.439317+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 20946944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:41.439770+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1383849 data_alloc: 234881024 data_used: 12136448
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 20946944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:42.440161+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f93d3000/0x0/0x4ffc00000, data 0x21dcd58/0x2299000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 20946944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:43.440323+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 20946944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:44.440697+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f93d3000/0x0/0x4ffc00000, data 0x21dcd58/0x2299000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 20946944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:45.440882+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 20946944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f93d3000/0x0/0x4ffc00000, data 0x21dcd58/0x2299000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:46.441050+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1383849 data_alloc: 234881024 data_used: 12136448
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 20946944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:47.441225+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f93d3000/0x0/0x4ffc00000, data 0x21dcd58/0x2299000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 123568128 unmapped: 19881984 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:48.441504+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128155648 unmapped: 15294464 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:49.441791+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.859272957s of 10.086807251s, submitted: 50
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128344064 unmapped: 15106048 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:50.442039+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128344064 unmapped: 15106048 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:51.442199+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1455911 data_alloc: 234881024 data_used: 12869632
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128344064 unmapped: 15106048 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:52.442690+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87c6000/0x0/0x4ffc00000, data 0x29d9d58/0x2a96000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128344064 unmapped: 15106048 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:53.442908+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128352256 unmapped: 15097856 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:54.443067+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128352256 unmapped: 15097856 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:55.443256+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87c6000/0x0/0x4ffc00000, data 0x29d9d58/0x2a96000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128352256 unmapped: 15097856 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:56.443460+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1455911 data_alloc: 234881024 data_used: 12869632
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:57.443694+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128352256 unmapped: 15097856 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:58.443933+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128368640 unmapped: 15081472 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:59.444162+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128368640 unmapped: 15081472 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87c6000/0x0/0x4ffc00000, data 0x29d9d58/0x2a96000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:00.444379+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128376832 unmapped: 15073280 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:01.444614+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128376832 unmapped: 15073280 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87c6000/0x0/0x4ffc00000, data 0x29d9d58/0x2a96000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1456703 data_alloc: 234881024 data_used: 12951552
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:02.444836+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128376832 unmapped: 15073280 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:03.445059+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128376832 unmapped: 15073280 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.702910423s of 13.609095573s, submitted: 14
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a96800 session 0x56365876ef00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a97000 session 0x5636587ca3c0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87c6000/0x0/0x4ffc00000, data 0x29d9d58/0x2a96000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x563657b5a3c0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:04.445265+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 123920384 unmapped: 19529728 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:05.445452+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 123920384 unmapped: 19529728 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:06.445627+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 123920384 unmapped: 19529728 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301916 data_alloc: 218103808 data_used: 5804032
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:07.445767+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 123920384 unmapped: 19529728 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58400 session 0x56365874ed20
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:08.445938+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f95c9000/0x0/0x4ffc00000, data 0x1bd8cb3/0x1c91000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655aab680
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:09.446110+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:10.446260+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:11.446451+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150304 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:12.446687+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa438000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:13.446850+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:14.447010+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:15.447194+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa438000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:16.447337+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150304 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa438000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:17.447517+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:18.447689+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:19.447817+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:20.447980+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa438000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:21.448115+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150304 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:22.448320+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa438000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:23.448513+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:24.448678+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:25.448851+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:26.448974+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150304 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:27.449163+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:28.449351+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa438000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:29.449502+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:30.449677+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:31.449800+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150304 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:32.450021+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa438000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 29.523105621s of 29.712411880s, submitted: 70
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:33.450184+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563656b24000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 31752192 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:34.450342+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 31752192 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9699000/0x0/0x4ffc00000, data 0x1b0cc41/0x1bc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:35.450475+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 31752192 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:36.450643+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 31752192 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1254732 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:37.450788+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 31752192 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a96800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a96800 session 0x56365678e3c0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:38.450931+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 31744000 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a97000
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a97000 session 0x5636587c50e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:39.451182+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636578fd0e0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657f58400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 31744000 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58400 session 0x5636586885a0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9699000/0x0/0x4ffc00000, data 0x1b0cc41/0x1bc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:40.451339+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 31744000 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657f58400
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9699000/0x0/0x4ffc00000, data 0x1b0cc41/0x1bc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:41.451511+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122322944 unmapped: 28999680 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1317052 data_alloc: 234881024 data_used: 9334784
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:42.451683+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9699000/0x0/0x4ffc00000, data 0x1b0cc41/0x1bc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:43.451872+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:44.452055+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:45.452216+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:46.452388+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1348212 data_alloc: 234881024 data_used: 14024704
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:47.452586+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9699000/0x0/0x4ffc00000, data 0x1b0cc41/0x1bc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:48.452795+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:49.452971+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:50.453154+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:51.453301+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.175664902s of 18.268918991s, submitted: 18
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 127483904 unmapped: 23838720 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1430746 data_alloc: 234881024 data_used: 14032896
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8b00000/0x0/0x4ffc00000, data 0x269fc41/0x2756000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:52.453487+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132423680 unmapped: 18898944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:53.453669+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132751360 unmapped: 18571264 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:54.453846+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f78b5000/0x0/0x4ffc00000, data 0x2748c41/0x27ff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132784128 unmapped: 18538496 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:55.454044+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132784128 unmapped: 18538496 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f78b5000/0x0/0x4ffc00000, data 0x2748c41/0x27ff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:56.454253+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132816896 unmapped: 18505728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1453958 data_alloc: 234881024 data_used: 14815232
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:57.454391+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132825088 unmapped: 18497536 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:58.454516+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132825088 unmapped: 18497536 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:59.454725+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132947968 unmapped: 18374656 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:00.454834+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132947968 unmapped: 18374656 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:01.454979+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f789c000/0x0/0x4ffc00000, data 0x2769c41/0x2820000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132947968 unmapped: 18374656 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1449662 data_alloc: 234881024 data_used: 14823424
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:02.455183+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132947968 unmapped: 18374656 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:03.455412+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132947968 unmapped: 18374656 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f789c000/0x0/0x4ffc00000, data 0x2769c41/0x2820000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:04.455629+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132947968 unmapped: 18374656 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.859183311s of 13.162115097s, submitted: 116
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:05.455785+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 133955584 unmapped: 17367040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:06.455938+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 133955584 unmapped: 17367040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58400 session 0x563655c534a0
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1449854 data_alloc: 234881024 data_used: 14823424
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a96800
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:07.456096+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a96800 session 0x56365861cf00
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:08.456242+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:09.456476+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:10.457666+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:11.457995+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:12.458409+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:13.459042+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:14.459530+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:15.460041+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:16.460586+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:17.461103+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:18.461733+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:19.462478+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:20.462963+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:21.463274+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:22.463654+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:23.463943+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:24.464169+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:25.464678+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:26.465088+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:27.465462+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:28.465812+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:29.466208+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:30.466750+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:31.467100+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:32.467703+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:33.467943+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:34.468210+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:35.468450+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:36.468688+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:37.468983+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:38.469226+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:39.469554+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:40.469804+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:41.470068+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:42.471964+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:43.473364+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:44.475016+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:45.475906+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122568704 unmapped: 28753920 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:46.477261+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122568704 unmapped: 28753920 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:47.478832+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122568704 unmapped: 28753920 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:48.479051+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122568704 unmapped: 28753920 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:49.479293+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122568704 unmapped: 28753920 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:50.480238+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122568704 unmapped: 28753920 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:51.481093+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122576896 unmapped: 28745728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:52.481578+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122576896 unmapped: 28745728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:53.481850+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122576896 unmapped: 28745728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:54.482207+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122576896 unmapped: 28745728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:55.482424+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122576896 unmapped: 28745728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:56.482754+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122576896 unmapped: 28745728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:57.483224+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122576896 unmapped: 28745728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:58.483461+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:59.483805+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:00.483999+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:01.484354+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:02.484726+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:03.484952+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:04.485293+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:05.485484+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:06.485657+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:07.485816+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:08.485967+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:09.486123+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:10.486311+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:11.486487+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:12.486786+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:13.486952+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:14.487160+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:15.487309+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:16.487482+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:17.487661+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:18.487849+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:19.488002+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:20.488150+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:21.488284+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:22.488488+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122617856 unmapped: 28704768 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:23.488660+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 29679616 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:24.488824+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 29679616 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:25.488941+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 29679616 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:26.489101+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 29679616 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:27.489239+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 29679616 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:28.489385+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 29679616 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:29.489567+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 29679616 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:30.489658+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 29630464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:31.489799+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: do_command 'config diff' '{prefix=config diff}'
Dec 01 10:28:04 compute-2 ceph-osd[78644]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec 01 10:28:04 compute-2 ceph-osd[78644]: do_command 'config show' '{prefix=config show}'
Dec 01 10:28:04 compute-2 ceph-osd[78644]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec 01 10:28:04 compute-2 ceph-osd[78644]: do_command 'counter dump' '{prefix=counter dump}'
Dec 01 10:28:04 compute-2 ceph-osd[78644]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec 01 10:28:04 compute-2 ceph-osd[78644]: do_command 'counter schema' '{prefix=counter schema}'
Dec 01 10:28:04 compute-2 ceph-osd[78644]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:28:04 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:28:04 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121544704 unmapped: 29777920 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:32.489976+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:28:04 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:33.490651+0000)
Dec 01 10:28:04 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:28:04 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121503744 unmapped: 29818880 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:28:04 compute-2 ceph-osd[78644]: do_command 'log dump' '{prefix=log dump}'
Dec 01 10:28:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:28:04.716 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:28:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:28:04.717 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:28:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:28:04.718 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:28:04 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec 01 10:28:04 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1259430974' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 01 10:28:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:28:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:04.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:28:05 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 01 10:28:05 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3668651258' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 10:28:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:28:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:05.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:28:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:05 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec 01 10:28:05 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/649092568' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 01 10:28:05 compute-2 ceph-mon[76053]: from='client.25759 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:05 compute-2 ceph-mon[76053]: from='client.16503 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:05 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1259430974' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 01 10:28:05 compute-2 ceph-mon[76053]: from='client.26294 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:05 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3568902322' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 01 10:28:05 compute-2 ceph-mon[76053]: from='client.25774 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:05 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2946393104' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 01 10:28:05 compute-2 ceph-mon[76053]: from='client.16518 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:05 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3668651258' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 10:28:05 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/274395677' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 10:28:05 compute-2 ceph-mon[76053]: from='client.26306 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:05 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/282264157' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 01 10:28:05 compute-2 ceph-mon[76053]: from='client.25795 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:05 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:28:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:06 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Dec 01 10:28:06 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/727522014' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Dec 01 10:28:06 compute-2 crontab[243460]: (root) LIST (root)
Dec 01 10:28:06 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 01 10:28:06 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1397456884' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 10:28:06 compute-2 nova_compute[230216]: 2025-12-01 10:28:06.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:28:06 compute-2 nova_compute[230216]: 2025-12-01 10:28:06.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:28:06 compute-2 nova_compute[230216]: 2025-12-01 10:28:06.229 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:28:06 compute-2 nova_compute[230216]: 2025-12-01 10:28:06.230 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:28:06 compute-2 nova_compute[230216]: 2025-12-01 10:28:06.230 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:28:06 compute-2 nova_compute[230216]: 2025-12-01 10:28:06.230 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 10:28:06 compute-2 nova_compute[230216]: 2025-12-01 10:28:06.231 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:28:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:06 compute-2 ceph-mon[76053]: from='client.16536 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:06 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/649092568' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 01 10:28:06 compute-2 ceph-mon[76053]: from='client.26312 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:06 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2453994286' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 01 10:28:06 compute-2 ceph-mon[76053]: from='client.25813 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:06 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3177369341' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 01 10:28:06 compute-2 ceph-mon[76053]: from='client.16551 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:06 compute-2 ceph-mon[76053]: from='client.26327 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:06 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/727522014' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Dec 01 10:28:06 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2172107929' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Dec 01 10:28:06 compute-2 ceph-mon[76053]: from='client.25828 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:06 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1397456884' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 10:28:06 compute-2 ceph-mon[76053]: from='client.16569 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:06 compute-2 ceph-mon[76053]: pgmap v1126: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:28:06 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1374163346' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 01 10:28:06 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:28:06 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4041174835' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:28:06 compute-2 nova_compute[230216]: 2025-12-01 10:28:06.718 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:28:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:06 compute-2 nova_compute[230216]: 2025-12-01 10:28:06.881 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 10:28:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:06.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:06 compute-2 nova_compute[230216]: 2025-12-01 10:28:06.883 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4980MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 10:28:06 compute-2 nova_compute[230216]: 2025-12-01 10:28:06.883 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:28:06 compute-2 nova_compute[230216]: 2025-12-01 10:28:06.883 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:28:06 compute-2 nova_compute[230216]: 2025-12-01 10:28:06.966 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 10:28:06 compute-2 nova_compute[230216]: 2025-12-01 10:28:06.966 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 10:28:06 compute-2 nova_compute[230216]: 2025-12-01 10:28:06.990 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:28:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Dec 01 10:28:07 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3604648933' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Dec 01 10:28:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:07.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:28:07 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3562990374' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:28:07 compute-2 nova_compute[230216]: 2025-12-01 10:28:07.465 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:28:07 compute-2 nova_compute[230216]: 2025-12-01 10:28:07.472 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 10:28:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Dec 01 10:28:07 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/324145126' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec 01 10:28:07 compute-2 sudo[243706]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:28:07 compute-2 sudo[243706]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:28:07 compute-2 sudo[243706]: pam_unix(sudo:session): session closed for user root
Dec 01 10:28:07 compute-2 nova_compute[230216]: 2025-12-01 10:28:07.690 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 10:28:07 compute-2 nova_compute[230216]: 2025-12-01 10:28:07.692 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 10:28:07 compute-2 nova_compute[230216]: 2025-12-01 10:28:07.692 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.809s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:28:07 compute-2 ceph-mon[76053]: from='client.26345 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:07 compute-2 ceph-mon[76053]: from='client.25840 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/4041174835' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:28:07 compute-2 ceph-mon[76053]: from='client.16584 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:07 compute-2 ceph-mon[76053]: from='client.26363 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:07 compute-2 ceph-mon[76053]: from='client.25855 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2172100142' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Dec 01 10:28:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/2608203925' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:28:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/2608203925' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:28:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3604648933' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Dec 01 10:28:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2805994041' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Dec 01 10:28:07 compute-2 ceph-mon[76053]: from='client.16596 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:07 compute-2 ceph-mon[76053]: from='client.26378 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:07 compute-2 ceph-mon[76053]: from='client.25879 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/794302548' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec 01 10:28:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3562990374' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:28:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/324145126' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec 01 10:28:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Dec 01 10:28:07 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3262835888' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Dec 01 10:28:08 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Dec 01 10:28:08 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/359585233' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Dec 01 10:28:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:08 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Dec 01 10:28:08 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2117790772' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Dec 01 10:28:08 compute-2 ceph-mon[76053]: from='client.16620 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:08 compute-2 ceph-mon[76053]: from='client.26408 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:08 compute-2 ceph-mon[76053]: from='client.25897 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2770499777' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Dec 01 10:28:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3262835888' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Dec 01 10:28:08 compute-2 ceph-mon[76053]: from='client.16626 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3253431688' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Dec 01 10:28:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3798853096' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Dec 01 10:28:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/770018958' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Dec 01 10:28:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/359585233' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Dec 01 10:28:08 compute-2 ceph-mon[76053]: from='client.16641 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:08 compute-2 ceph-mon[76053]: pgmap v1127: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:28:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2117790772' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Dec 01 10:28:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/4130972969' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec 01 10:28:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2428953660' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:28:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2378032138' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Dec 01 10:28:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/393227238' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Dec 01 10:28:08 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Dec 01 10:28:08 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4094352961' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Dec 01 10:28:08 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Dec 01 10:28:08 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2510048211' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Dec 01 10:28:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:08.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:09 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Dec 01 10:28:09 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1772599635' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec 01 10:28:09 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Dec 01 10:28:09 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1045143495' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Dec 01 10:28:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:28:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:09.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:28:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:09 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Dec 01 10:28:09 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2860168883' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Dec 01 10:28:09 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Dec 01 10:28:09 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3698818619' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Dec 01 10:28:09 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/4094352961' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Dec 01 10:28:09 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2510048211' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Dec 01 10:28:09 compute-2 ceph-mon[76053]: from='client.16668 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:09 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/668896965' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Dec 01 10:28:09 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1371982183' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Dec 01 10:28:09 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/312117713' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec 01 10:28:09 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1772599635' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec 01 10:28:09 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1045143495' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Dec 01 10:28:09 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2057561117' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:28:09 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1435054120' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Dec 01 10:28:09 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2573024948' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Dec 01 10:28:09 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3388144832' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Dec 01 10:28:09 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/4268465877' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Dec 01 10:28:09 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:28:09 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/200579190' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:28:09 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2860168883' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Dec 01 10:28:09 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3698818619' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Dec 01 10:28:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:10 compute-2 systemd[1]: Starting Hostname Service...
Dec 01 10:28:10 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Dec 01 10:28:10 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1067950114' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec 01 10:28:10 compute-2 systemd[1]: Started Hostname Service.
Dec 01 10:28:10 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Dec 01 10:28:10 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/492178373' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Dec 01 10:28:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:10 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Dec 01 10:28:10 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3722669865' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Dec 01 10:28:10 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Dec 01 10:28:10 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3466911185' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Dec 01 10:28:10 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:28:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:28:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:10.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:28:10 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2269291642' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec 01 10:28:10 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2352800766' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Dec 01 10:28:10 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/806825132' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Dec 01 10:28:10 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2712621208' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Dec 01 10:28:10 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1067950114' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec 01 10:28:10 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/492178373' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Dec 01 10:28:10 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2625422848' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Dec 01 10:28:10 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2311727613' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:28:10 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/4070912794' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Dec 01 10:28:10 compute-2 ceph-mon[76053]: pgmap v1128: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:28:10 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3035109161' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec 01 10:28:10 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3724199474' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Dec 01 10:28:10 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3722669865' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Dec 01 10:28:10 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3466911185' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Dec 01 10:28:10 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2548317732' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Dec 01 10:28:11 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Dec 01 10:28:11 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1557592794' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Dec 01 10:28:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:28:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:11.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:28:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:11 compute-2 ceph-mon[76053]: from='client.26546 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:11 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/301400875' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Dec 01 10:28:11 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3822245562' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Dec 01 10:28:11 compute-2 ceph-mon[76053]: from='client.26020 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:11 compute-2 ceph-mon[76053]: from='client.26564 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:11 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1557592794' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Dec 01 10:28:11 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/760713574' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec 01 10:28:11 compute-2 ceph-mon[76053]: from='client.26570 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:11 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/252266620' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Dec 01 10:28:11 compute-2 ceph-mon[76053]: from='client.26032 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:11 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2484782800' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Dec 01 10:28:11 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3356989962' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Dec 01 10:28:11 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2746776080' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Dec 01 10:28:12 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Dec 01 10:28:12 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/33292853' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Dec 01 10:28:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:12 compute-2 podman[244470]: 2025-12-01 10:28:12.442977619 +0000 UTC m=+0.096091711 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 01 10:28:12 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Dec 01 10:28:12 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1382439860' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Dec 01 10:28:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:12.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:12 compute-2 ceph-mon[76053]: from='client.26038 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:12 compute-2 ceph-mon[76053]: from='client.26582 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:12 compute-2 ceph-mon[76053]: from='client.26050 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:12 compute-2 ceph-mon[76053]: from='client.26597 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:12 compute-2 ceph-mon[76053]: from='client.16797 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:12 compute-2 ceph-mon[76053]: from='client.16803 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:12 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/33292853' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Dec 01 10:28:12 compute-2 ceph-mon[76053]: from='client.26615 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:12 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3826764103' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Dec 01 10:28:12 compute-2 ceph-mon[76053]: pgmap v1129: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:28:12 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3458671968' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Dec 01 10:28:12 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1382439860' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Dec 01 10:28:12 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1132174747' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 01 10:28:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:13.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:13 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Dec 01 10:28:13 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3684389087' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 01 10:28:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:13 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 01 10:28:13 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 01 10:28:13 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Dec 01 10:28:13 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/771270450' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Dec 01 10:28:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:14 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 01 10:28:14 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 01 10:28:14 compute-2 ceph-mon[76053]: from='client.16818 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:14 compute-2 ceph-mon[76053]: from='client.26077 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:14 compute-2 ceph-mon[76053]: from='client.26624 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:14 compute-2 ceph-mon[76053]: from='client.16827 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:14 compute-2 ceph-mon[76053]: from='client.26636 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:14 compute-2 ceph-mon[76053]: from='client.16839 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:14 compute-2 ceph-mon[76053]: from='client.26089 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:14 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3672121130' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Dec 01 10:28:14 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3684389087' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 01 10:28:14 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/4008874652' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Dec 01 10:28:14 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 01 10:28:14 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 01 10:28:14 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 01 10:28:14 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 01 10:28:14 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 01 10:28:14 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 01 10:28:14 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/771270450' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Dec 01 10:28:14 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3942174856' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Dec 01 10:28:14 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 01 10:28:14 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 01 10:28:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:14 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0)
Dec 01 10:28:14 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4157827772' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Dec 01 10:28:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:14.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:15 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 01 10:28:15 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 01 10:28:15 compute-2 ceph-mon[76053]: from='client.26648 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:15 compute-2 ceph-mon[76053]: from='client.16851 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:15 compute-2 ceph-mon[76053]: from='client.26104 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:15 compute-2 ceph-mon[76053]: from='client.16869 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:15 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 01 10:28:15 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 01 10:28:15 compute-2 ceph-mon[76053]: from='client.26125 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:15 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3320534148' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Dec 01 10:28:15 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 01 10:28:15 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 01 10:28:15 compute-2 ceph-mon[76053]: from='client.16887 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:15 compute-2 ceph-mon[76053]: pgmap v1130: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:28:15 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/831062327' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 01 10:28:15 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/4157827772' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Dec 01 10:28:15 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1043111903' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Dec 01 10:28:15 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 01 10:28:15 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 01 10:28:15 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2455707800' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Dec 01 10:28:15 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 01 10:28:15 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 01 10:28:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:15.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:15 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Dec 01 10:28:15 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2572542720' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Dec 01 10:28:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:15 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:28:15 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df"} v 0)
Dec 01 10:28:15 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2329117751' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Dec 01 10:28:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:16 compute-2 podman[244911]: 2025-12-01 10:28:16.133413866 +0000 UTC m=+0.057966324 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 01 10:28:16 compute-2 ceph-mon[76053]: from='client.26699 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:16 compute-2 ceph-mon[76053]: from='client.16893 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:16 compute-2 ceph-mon[76053]: from='client.26170 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:16 compute-2 ceph-mon[76053]: from='client.16917 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:16 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 01 10:28:16 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 01 10:28:16 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2572542720' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Dec 01 10:28:16 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2646154821' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Dec 01 10:28:16 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/251848729' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Dec 01 10:28:16 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2329117751' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Dec 01 10:28:16 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1300857708' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Dec 01 10:28:16 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs dump"} v 0)
Dec 01 10:28:16 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/343288871' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Dec 01 10:28:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:16 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs ls"} v 0)
Dec 01 10:28:16 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/670601805' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Dec 01 10:28:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:16.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:17 compute-2 ceph-mon[76053]: from='client.16947 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:17 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/343288871' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Dec 01 10:28:17 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3135176345' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Dec 01 10:28:17 compute-2 ceph-mon[76053]: pgmap v1131: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:28:17 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1550480148' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Dec 01 10:28:17 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/670601805' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Dec 01 10:28:17 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2960432479' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Dec 01 10:28:17 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3841472996' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Dec 01 10:28:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:28:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:17.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:28:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:17 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mds stat"} v 0)
Dec 01 10:28:17 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1852991175' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Dec 01 10:28:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:18 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump"} v 0)
Dec 01 10:28:18 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/350658242' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Dec 01 10:28:18 compute-2 ceph-mon[76053]: from='client.26759 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:18 compute-2 ceph-mon[76053]: from='client.26218 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:18 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/349683807' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Dec 01 10:28:18 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1852991175' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Dec 01 10:28:18 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2896343475' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Dec 01 10:28:18 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3805565623' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Dec 01 10:28:18 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/350658242' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Dec 01 10:28:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:18 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls"} v 0)
Dec 01 10:28:18 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2905400677' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Dec 01 10:28:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:18.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:28:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:19.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:28:19 compute-2 ceph-mon[76053]: from='client.26789 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:19 compute-2 ceph-mon[76053]: from='client.16980 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:19 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3202030290' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Dec 01 10:28:19 compute-2 ceph-mon[76053]: pgmap v1132: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:28:19 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/633188691' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Dec 01 10:28:19 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2905400677' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Dec 01 10:28:19 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1210079615' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Dec 01 10:28:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:20 compute-2 ceph-mon[76053]: from='client.26245 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:20 compute-2 ceph-mon[76053]: from='client.26801 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:20 compute-2 ceph-mon[76053]: from='client.26807 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:20 compute-2 ceph-mon[76053]: from='client.26257 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:20 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2630146053' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Dec 01 10:28:20 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1572732039' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Dec 01 10:28:20 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/494296196' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Dec 01 10:28:20 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd dump"} v 0)
Dec 01 10:28:20 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2569575389' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Dec 01 10:28:20 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:28:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:28:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:20.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:28:21 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd numa-status"} v 0)
Dec 01 10:28:21 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/812675118' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Dec 01 10:28:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:28:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:21.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:28:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:21 compute-2 ceph-mon[76053]: from='client.17007 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:21 compute-2 ceph-mon[76053]: from='client.26266 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:21 compute-2 ceph-mon[76053]: from='client.17022 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:21 compute-2 ceph-mon[76053]: pgmap v1133: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:28:21 compute-2 ceph-mon[76053]: from='client.26834 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:21 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2569575389' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Dec 01 10:28:21 compute-2 ceph-mon[76053]: from='client.26275 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:21 compute-2 ceph-mon[76053]: from='client.26843 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:21 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/812675118' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Dec 01 10:28:21 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1445338784' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Dec 01 10:28:21 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/241861749' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Dec 01 10:28:21 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3711842620' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Dec 01 10:28:21 compute-2 ovs-appctl[246057]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 01 10:28:21 compute-2 ovs-appctl[246068]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 01 10:28:21 compute-2 ovs-appctl[246074]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 01 10:28:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:22 compute-2 ceph-mon[76053]: from='client.26290 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:22 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2911707120' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Dec 01 10:28:22 compute-2 ceph-mon[76053]: from='client.17046 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:22 compute-2 ceph-mon[76053]: from='client.26861 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:22 compute-2 ceph-mon[76053]: from='client.26296 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:22 compute-2 ceph-mon[76053]: from='client.17055 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:22 compute-2 ceph-mon[76053]: from='client.26867 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:22 compute-2 ceph-mon[76053]: pgmap v1134: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:28:22 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1369578988' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Dec 01 10:28:22 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/284725427' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Dec 01 10:28:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:22.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:22 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd stat"} v 0)
Dec 01 10:28:22 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/929110889' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Dec 01 10:28:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:28:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:23.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:28:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:23 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2400025989' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 01 10:28:23 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/929110889' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Dec 01 10:28:23 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1424096683' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Dec 01 10:28:23 compute-2 ceph-mon[76053]: from='client.26320 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:23 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/863008765' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Dec 01 10:28:23 compute-2 ceph-mon[76053]: from='client.17085 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:23 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/4163488074' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Dec 01 10:28:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:24 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Dec 01 10:28:24 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1048913038' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 01 10:28:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:24 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "time-sync-status"} v 0)
Dec 01 10:28:24 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4053726301' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Dec 01 10:28:24 compute-2 ceph-mon[76053]: from='client.26326 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:24 compute-2 ceph-mon[76053]: from='client.17094 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:28:24 compute-2 ceph-mon[76053]: from='client.26903 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:24 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1086875295' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 01 10:28:24 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1048913038' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 01 10:28:24 compute-2 ceph-mon[76053]: pgmap v1135: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:28:24 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/856131792' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 01 10:28:24 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:28:24 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3854596344' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Dec 01 10:28:24 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/4053726301' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Dec 01 10:28:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:24.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:25 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump", "format": "json-pretty"} v 0)
Dec 01 10:28:25 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1411535655' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Dec 01 10:28:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:25.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:25 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:28:25 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3750332313' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Dec 01 10:28:25 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3187522058' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Dec 01 10:28:25 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1411535655' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Dec 01 10:28:25 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1523324552' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Dec 01 10:28:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:25 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail", "format": "json-pretty"} v 0)
Dec 01 10:28:25 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2202643460' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 01 10:28:26 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json-pretty"} v 0)
Dec 01 10:28:26 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3880773461' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Dec 01 10:28:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:26.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:27.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:27 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs dump", "format": "json-pretty"} v 0)
Dec 01 10:28:27 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2941953303' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Dec 01 10:28:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:27 compute-2 podman[247471]: 2025-12-01 10:28:27.469666846 +0000 UTC m=+0.108493566 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 01 10:28:27 compute-2 ceph-mon[76053]: from='client.17121 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:27 compute-2 ceph-mon[76053]: from='client.26356 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:27 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2198529714' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Dec 01 10:28:27 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1184195527' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 01 10:28:27 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2202643460' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 01 10:28:27 compute-2 ceph-mon[76053]: from='client.26948 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:27 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/4048496310' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Dec 01 10:28:27 compute-2 ceph-mon[76053]: pgmap v1136: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:28:27 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3880773461' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Dec 01 10:28:27 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/446968811' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Dec 01 10:28:27 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1510056962' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Dec 01 10:28:27 compute-2 sudo[247512]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:28:27 compute-2 sudo[247512]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:28:27 compute-2 sudo[247512]: pam_unix(sudo:session): session closed for user root
Dec 01 10:28:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:28 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs ls", "format": "json-pretty"} v 0)
Dec 01 10:28:28 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2484993663' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Dec 01 10:28:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:28 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2609922541' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Dec 01 10:28:28 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3039484457' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Dec 01 10:28:28 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2941953303' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Dec 01 10:28:28 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2484993663' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Dec 01 10:28:28 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mds stat", "format": "json-pretty"} v 0)
Dec 01 10:28:28 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/436766211' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Dec 01 10:28:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:28:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:28.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:28:29 compute-2 sudo[247641]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:28:29 compute-2 sudo[247641]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:28:29 compute-2 sudo[247641]: pam_unix(sudo:session): session closed for user root
Dec 01 10:28:29 compute-2 sudo[247666]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:28:29 compute-2 sudo[247666]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:28:29 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json-pretty"} v 0)
Dec 01 10:28:29 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1092829110' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Dec 01 10:28:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:29.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:29 compute-2 ceph-mon[76053]: from='client.17154 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:29 compute-2 ceph-mon[76053]: from='client.26969 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:29 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/805398601' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Dec 01 10:28:29 compute-2 ceph-mon[76053]: from='client.26401 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:29 compute-2 ceph-mon[76053]: pgmap v1137: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:28:29 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/4139865496' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Dec 01 10:28:29 compute-2 ceph-mon[76053]: from='client.26984 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:29 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/436766211' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Dec 01 10:28:29 compute-2 ceph-mon[76053]: from='client.26990 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:29 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1851788279' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Dec 01 10:28:29 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2946772512' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Dec 01 10:28:29 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1092829110' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Dec 01 10:28:29 compute-2 ceph-mon[76053]: from='client.17175 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:29 compute-2 sudo[247666]: pam_unix(sudo:session): session closed for user root
Dec 01 10:28:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:30 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls", "format": "json-pretty"} v 0)
Dec 01 10:28:30 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3339444330' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Dec 01 10:28:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:30 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:28:30 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:28:30 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:28:30 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:28:30 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3869686804' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Dec 01 10:28:30 compute-2 ceph-mon[76053]: from='client.26422 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:30 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/4200155020' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Dec 01 10:28:30 compute-2 ceph-mon[76053]: from='client.27014 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:30 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3339444330' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Dec 01 10:28:30 compute-2 ceph-mon[76053]: from='client.17196 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:30 compute-2 ceph-mon[76053]: from='client.27020 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:30 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:28:30 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:28:30 compute-2 ceph-mon[76053]: pgmap v1138: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:28:30 compute-2 ceph-mon[76053]: pgmap v1139: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 614 B/s rd, 0 op/s
Dec 01 10:28:30 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:28:30 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:28:30 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:28:30 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:28:30 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:28:30 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:28:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:28:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:30.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:28:31 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd dump", "format": "json-pretty"} v 0)
Dec 01 10:28:31 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1560859100' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Dec 01 10:28:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:31.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:31 compute-2 ceph-mon[76053]: from='client.26440 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:31 compute-2 ceph-mon[76053]: from='client.17205 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:31 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3026284103' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 01 10:28:31 compute-2 ceph-mon[76053]: from='client.26446 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:31 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1544160' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Dec 01 10:28:31 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3197457651' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Dec 01 10:28:31 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1560859100' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Dec 01 10:28:31 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1496822608' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Dec 01 10:28:31 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd numa-status", "format": "json-pretty"} v 0)
Dec 01 10:28:31 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/91251157' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Dec 01 10:28:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:32 compute-2 ceph-mon[76053]: from='client.27047 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:32 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/91251157' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Dec 01 10:28:32 compute-2 ceph-mon[76053]: from='client.17229 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:32 compute-2 ceph-mon[76053]: from='client.27053 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:32 compute-2 ceph-mon[76053]: from='client.17241 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:32 compute-2 ceph-mon[76053]: from='client.26470 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:32 compute-2 ceph-mon[76053]: pgmap v1140: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 614 B/s rd, 0 op/s
Dec 01 10:28:32 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2673366903' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 01 10:28:32 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3615877885' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec 01 10:28:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:28:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:32.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:28:32 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"} v 0)
Dec 01 10:28:32 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2199592520' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 01 10:28:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:33 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd stat", "format": "json-pretty"} v 0)
Dec 01 10:28:33 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1613278691' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Dec 01 10:28:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:28:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:33.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:28:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:33 compute-2 ceph-mon[76053]: from='client.26476 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:33 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2199592520' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 01 10:28:33 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2766592132' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Dec 01 10:28:33 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/4148240341' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Dec 01 10:28:33 compute-2 ceph-mon[76053]: from='client.17265 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:33 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1613278691' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Dec 01 10:28:33 compute-2 virtqemud[229722]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 01 10:28:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:33 compute-2 systemd[1]: Starting Time & Date Service...
Dec 01 10:28:34 compute-2 systemd[1]: Started Time & Date Service.
Dec 01 10:28:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:34 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0)
Dec 01 10:28:34 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3594488495' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec 01 10:28:34 compute-2 ceph-mon[76053]: from='client.26500 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:34 compute-2 ceph-mon[76053]: from='client.17268 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:34 compute-2 ceph-mon[76053]: from='client.26509 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:28:34 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/4227227138' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec 01 10:28:34 compute-2 ceph-mon[76053]: pgmap v1141: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 614 B/s rd, 0 op/s
Dec 01 10:28:34 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/211240088' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Dec 01 10:28:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:28:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:34.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:28:35 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "time-sync-status", "format": "json-pretty"} v 0)
Dec 01 10:28:35 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4001628261' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Dec 01 10:28:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:28:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:35.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:28:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:35 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:28:35 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3594488495' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec 01 10:28:35 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/4001628261' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Dec 01 10:28:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:36 compute-2 sudo[248531]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:28:36 compute-2 sudo[248531]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:28:36 compute-2 sudo[248531]: pam_unix(sudo:session): session closed for user root
Dec 01 10:28:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:36.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:37 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:28:37 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:28:37 compute-2 ceph-mon[76053]: pgmap v1142: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 614 B/s rd, 0 op/s
Dec 01 10:28:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:37.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:38 compute-2 ceph-mon[76053]: pgmap v1143: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 614 B/s rd, 0 op/s
Dec 01 10:28:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:38.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:39.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:28:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:40 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:28:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:28:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:40.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:28:41 compute-2 ceph-mon[76053]: pgmap v1144: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 614 B/s rd, 0 op/s
Dec 01 10:28:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:28:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:41.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:28:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:42 compute-2 ceph-mon[76053]: pgmap v1145: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:28:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:42.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:28:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:43.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:28:43 compute-2 podman[248562]: 2025-12-01 10:28:43.390052809 +0000 UTC m=+0.048853851 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 01 10:28:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:44 compute-2 ceph-mon[76053]: pgmap v1146: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:28:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:28:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:44.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:28:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:45.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:45 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:28:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:46 compute-2 podman[248586]: 2025-12-01 10:28:46.390562636 +0000 UTC m=+0.054057469 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 01 10:28:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:46 compute-2 ceph-mon[76053]: pgmap v1147: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:28:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:46.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:47.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:48 compute-2 sudo[248608]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:28:48 compute-2 sudo[248608]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:28:48 compute-2 sudo[248608]: pam_unix(sudo:session): session closed for user root
Dec 01 10:28:48 compute-2 ceph-mon[76053]: pgmap v1148: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:28:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:28:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:48.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:28:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:49.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:50 compute-2 ceph-mon[76053]: pgmap v1149: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:28:50 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:28:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:50.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:51.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:52 compute-2 ceph-mon[76053]: pgmap v1150: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:28:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:52.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:53.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:54 compute-2 ceph-mon[76053]: pgmap v1151: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:28:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:54.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:55.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:28:55 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:28:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:56 compute-2 ceph-mon[76053]: pgmap v1152: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:28:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:56.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:57.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:58 compute-2 podman[248645]: 2025-12-01 10:28:58.473402277 +0000 UTC m=+0.116638255 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 01 10:28:58 compute-2 nova_compute[230216]: 2025-12-01 10:28:58.692 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:28:58 compute-2 nova_compute[230216]: 2025-12-01 10:28:58.693 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:28:58 compute-2 ceph-mon[76053]: pgmap v1153: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:28:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:58.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:28:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:28:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:59.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:28:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:28:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:00 compute-2 nova_compute[230216]: 2025-12-01 10:29:00.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:29:00 compute-2 nova_compute[230216]: 2025-12-01 10:29:00.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 10:29:00 compute-2 nova_compute[230216]: 2025-12-01 10:29:00.209 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 10:29:00 compute-2 nova_compute[230216]: 2025-12-01 10:29:00.226 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 10:29:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:00 compute-2 ceph-mon[76053]: pgmap v1154: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:29:00 compute-2 sshd-session[248637]: Received disconnect from 45.78.219.119 port 35996:11: Bye Bye [preauth]
Dec 01 10:29:00 compute-2 sshd-session[248637]: Disconnected from authenticating user root 45.78.219.119 port 35996 [preauth]
Dec 01 10:29:00 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:29:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:00.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:29:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:01.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:29:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:02 compute-2 nova_compute[230216]: 2025-12-01 10:29:02.220 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:29:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:02 compute-2 ceph-mon[76053]: pgmap v1155: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:29:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:02.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:29:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:03.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:29:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:04 compute-2 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 01 10:29:04 compute-2 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 01 10:29:04 compute-2 nova_compute[230216]: 2025-12-01 10:29:04.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:29:04 compute-2 nova_compute[230216]: 2025-12-01 10:29:04.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:29:04 compute-2 nova_compute[230216]: 2025-12-01 10:29:04.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 10:29:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:04 compute-2 ceph-mon[76053]: pgmap v1156: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:29:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:29:04.717 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:29:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:29:04.718 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:29:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:29:04.719 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:29:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:04.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:29:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:05.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:29:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:05 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:29:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:06 compute-2 nova_compute[230216]: 2025-12-01 10:29:06.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:29:06 compute-2 nova_compute[230216]: 2025-12-01 10:29:06.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:29:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:06 compute-2 ceph-mon[76053]: pgmap v1157: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:29:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:06.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:07 compute-2 nova_compute[230216]: 2025-12-01 10:29:07.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:29:07 compute-2 nova_compute[230216]: 2025-12-01 10:29:07.227 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:29:07 compute-2 nova_compute[230216]: 2025-12-01 10:29:07.227 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:29:07 compute-2 nova_compute[230216]: 2025-12-01 10:29:07.228 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:29:07 compute-2 nova_compute[230216]: 2025-12-01 10:29:07.228 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 10:29:07 compute-2 nova_compute[230216]: 2025-12-01 10:29:07.228 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:29:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:29:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:07.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:29:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/2726321627' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:29:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/2726321627' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:29:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:29:07 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2222777337' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:29:07 compute-2 nova_compute[230216]: 2025-12-01 10:29:07.704 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:29:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:07 compute-2 nova_compute[230216]: 2025-12-01 10:29:07.899 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 10:29:07 compute-2 nova_compute[230216]: 2025-12-01 10:29:07.901 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5031MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 10:29:07 compute-2 nova_compute[230216]: 2025-12-01 10:29:07.901 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:29:07 compute-2 nova_compute[230216]: 2025-12-01 10:29:07.902 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:29:07 compute-2 nova_compute[230216]: 2025-12-01 10:29:07.976 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 10:29:07 compute-2 nova_compute[230216]: 2025-12-01 10:29:07.977 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 10:29:07 compute-2 nova_compute[230216]: 2025-12-01 10:29:07.996 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:29:08 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:29:08 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/543342490' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:29:08 compute-2 nova_compute[230216]: 2025-12-01 10:29:08.425 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:29:08 compute-2 nova_compute[230216]: 2025-12-01 10:29:08.431 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 10:29:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2222777337' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:29:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/543342490' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:29:08 compute-2 ceph-mon[76053]: pgmap v1158: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:29:08 compute-2 nova_compute[230216]: 2025-12-01 10:29:08.637 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 10:29:08 compute-2 nova_compute[230216]: 2025-12-01 10:29:08.639 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 10:29:08 compute-2 nova_compute[230216]: 2025-12-01 10:29:08.640 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.738s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:29:08 compute-2 sudo[248729]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:29:08 compute-2 sudo[248729]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:29:08 compute-2 sudo[248729]: pam_unix(sudo:session): session closed for user root
Dec 01 10:29:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:08.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:09.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:09 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/697846383' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:29:09 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/4009921699' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:29:09 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:29:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:10 compute-2 ceph-mon[76053]: pgmap v1159: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:29:10 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:29:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:10.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:11.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:11 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/802762943' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:29:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:12 compute-2 ceph-mon[76053]: pgmap v1160: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:29:12 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/756928821' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:29:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:12.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:13.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:14 compute-2 podman[248760]: 2025-12-01 10:29:14.410640794 +0000 UTC m=+0.063872705 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 01 10:29:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:14 compute-2 ceph-mon[76053]: pgmap v1161: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:29:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:14.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:29:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:15.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:29:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:15 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:29:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:16 compute-2 ceph-mon[76053]: pgmap v1162: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:29:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:16.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:17 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 10:29:17 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 2400.0 total, 600.0 interval
                                           Cumulative writes: 6824 writes, 36K keys, 6824 commit groups, 1.0 writes per commit group, ingest: 0.09 GB, 0.04 MB/s
                                           Cumulative WAL: 6824 writes, 6824 syncs, 1.00 writes per sync, written: 0.09 GB, 0.04 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1585 writes, 8157 keys, 1585 commit groups, 1.0 writes per commit group, ingest: 18.27 MB, 0.03 MB/s
                                           Interval WAL: 1585 writes, 1585 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    113.3      0.44              0.15        18    0.025       0      0       0.0       0.0
                                             L6      1/0   13.94 MB   0.0      0.3     0.0      0.2       0.2      0.0       0.0   4.5    101.8     87.6      2.55              0.59        17    0.150     94K   9344       0.0       0.0
                                            Sum      1/0   13.94 MB   0.0      0.3     0.0      0.2       0.3      0.1       0.0   5.5     86.8     91.4      2.99              0.73        35    0.086     94K   9344       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   5.9     53.6     54.7      1.25              0.17         8    0.156     26K   2583       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.3     0.0      0.2       0.2      0.0       0.0   0.0    101.8     87.6      2.55              0.59        17    0.150     94K   9344       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    113.9      0.44              0.15        17    0.026       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 2400.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.049, interval 0.011
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.27 GB write, 0.11 MB/s write, 0.25 GB read, 0.11 MB/s read, 3.0 seconds
                                           Interval compaction: 0.07 GB write, 0.11 MB/s write, 0.07 GB read, 0.11 MB/s read, 1.2 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x555b631689b0#2 capacity: 304.00 MB usage: 22.57 MB table_size: 0 occupancy: 18446744073709551615 collections: 5 last_copies: 0 last_secs: 0.000271 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1387,21.84 MB,7.18287%) FilterBlock(35,275.17 KB,0.0883956%) IndexBlock(35,476.58 KB,0.153095%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Dec 01 10:29:17 compute-2 podman[248781]: 2025-12-01 10:29:17.407385446 +0000 UTC m=+0.058092041 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 01 10:29:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:29:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:17.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:29:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:18 compute-2 ceph-mon[76053]: pgmap v1163: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:29:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:29:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:18.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:29:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:19.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:20 compute-2 ceph-mon[76053]: pgmap v1164: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:29:20 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:29:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:20.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:21.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:22 compute-2 ceph-mon[76053]: pgmap v1165: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:29:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:22.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:23.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:24 compute-2 ceph-mon[76053]: pgmap v1166: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:29:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:24.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:25.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:29:25 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:29:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:26 compute-2 ceph-mon[76053]: pgmap v1167: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:29:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:26.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:27.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:27 compute-2 sudo[241426]: pam_unix(sudo:session): session closed for user root
Dec 01 10:29:27 compute-2 sshd-session[241425]: Received disconnect from 192.168.122.10 port 41272:11: disconnected by user
Dec 01 10:29:27 compute-2 sshd-session[241425]: Disconnected from user zuul 192.168.122.10 port 41272
Dec 01 10:29:27 compute-2 sshd-session[241397]: pam_unix(sshd:session): session closed for user zuul
Dec 01 10:29:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:27 compute-2 systemd[1]: session-55.scope: Deactivated successfully.
Dec 01 10:29:27 compute-2 systemd[1]: session-55.scope: Consumed 2min 57.658s CPU time, 781.4M memory peak, read 324.8M from disk, written 205.8M to disk.
Dec 01 10:29:27 compute-2 systemd-logind[795]: Session 55 logged out. Waiting for processes to exit.
Dec 01 10:29:27 compute-2 systemd-logind[795]: Removed session 55.
Dec 01 10:29:27 compute-2 sshd-session[248811]: Accepted publickey for zuul from 192.168.122.10 port 32852 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 10:29:27 compute-2 systemd-logind[795]: New session 56 of user zuul.
Dec 01 10:29:27 compute-2 systemd[1]: Started Session 56 of User zuul.
Dec 01 10:29:27 compute-2 sshd-session[248811]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 10:29:27 compute-2 sudo[248815]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-2-2025-12-01-shjyvid.tar.xz
Dec 01 10:29:27 compute-2 sudo[248815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:29:27 compute-2 sudo[248815]: pam_unix(sudo:session): session closed for user root
Dec 01 10:29:27 compute-2 sshd-session[248814]: Received disconnect from 192.168.122.10 port 32852:11: disconnected by user
Dec 01 10:29:27 compute-2 sshd-session[248814]: Disconnected from user zuul 192.168.122.10 port 32852
Dec 01 10:29:27 compute-2 sshd-session[248811]: pam_unix(sshd:session): session closed for user zuul
Dec 01 10:29:27 compute-2 systemd[1]: session-56.scope: Deactivated successfully.
Dec 01 10:29:27 compute-2 systemd-logind[795]: Session 56 logged out. Waiting for processes to exit.
Dec 01 10:29:27 compute-2 systemd-logind[795]: Removed session 56.
Dec 01 10:29:27 compute-2 sshd-session[248840]: Accepted publickey for zuul from 192.168.122.10 port 43064 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 10:29:27 compute-2 systemd-logind[795]: New session 57 of user zuul.
Dec 01 10:29:27 compute-2 systemd[1]: Started Session 57 of User zuul.
Dec 01 10:29:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:27 compute-2 sshd-session[248840]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 10:29:27 compute-2 sudo[248844]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Dec 01 10:29:27 compute-2 sudo[248844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:29:27 compute-2 sudo[248844]: pam_unix(sudo:session): session closed for user root
Dec 01 10:29:28 compute-2 sshd-session[248843]: Received disconnect from 192.168.122.10 port 43064:11: disconnected by user
Dec 01 10:29:28 compute-2 sshd-session[248843]: Disconnected from user zuul 192.168.122.10 port 43064
Dec 01 10:29:28 compute-2 sshd-session[248840]: pam_unix(sshd:session): session closed for user zuul
Dec 01 10:29:28 compute-2 systemd[1]: session-57.scope: Deactivated successfully.
Dec 01 10:29:28 compute-2 systemd-logind[795]: Session 57 logged out. Waiting for processes to exit.
Dec 01 10:29:28 compute-2 systemd-logind[795]: Removed session 57.
Dec 01 10:29:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:28 compute-2 ceph-mon[76053]: pgmap v1168: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:29:28 compute-2 sudo[248869]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:29:28 compute-2 sudo[248869]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:29:28 compute-2 sudo[248869]: pam_unix(sudo:session): session closed for user root
Dec 01 10:29:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:28 compute-2 podman[248893]: 2025-12-01 10:29:28.919627223 +0000 UTC m=+0.116598683 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Dec 01 10:29:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:28.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:29.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:30 compute-2 ceph-mon[76053]: pgmap v1169: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:29:30 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:29:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:30.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:31.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:32 compute-2 ceph-mon[76053]: pgmap v1170: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:29:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:32.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:33.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:29:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:34.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:29:35 compute-2 ceph-mon[76053]: pgmap v1171: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:29:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:29:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:35.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:29:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:35 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:29:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:36 compute-2 ceph-mon[76053]: pgmap v1172: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:29:36 compute-2 sudo[248929]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:29:36 compute-2 sudo[248929]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:29:36 compute-2 sudo[248929]: pam_unix(sudo:session): session closed for user root
Dec 01 10:29:36 compute-2 sudo[248954]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Dec 01 10:29:36 compute-2 sudo[248954]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:29:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:36.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:36 compute-2 sudo[248954]: pam_unix(sudo:session): session closed for user root
Dec 01 10:29:37 compute-2 sudo[249000]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:29:37 compute-2 sudo[249000]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:29:37 compute-2 sudo[249000]: pam_unix(sudo:session): session closed for user root
Dec 01 10:29:37 compute-2 sudo[249025]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:29:37 compute-2 sudo[249025]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:29:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:37.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:37 compute-2 sudo[249025]: pam_unix(sudo:session): session closed for user root
Dec 01 10:29:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:38 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:29:38 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:29:38 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:29:38 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:29:38 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:29:38 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:29:38 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:29:38 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:29:38 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:29:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:38.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:29:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:39.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:29:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:39 compute-2 ceph-mon[76053]: pgmap v1173: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 546 B/s rd, 0 op/s
Dec 01 10:29:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:40 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:29:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:40 compute-2 ceph-mon[76053]: pgmap v1174: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 546 B/s rd, 0 op/s
Dec 01 10:29:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:29:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:40.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:41.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:29:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:42.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:29:42 compute-2 ceph-mon[76053]: pgmap v1175: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 819 B/s rd, 0 op/s
Dec 01 10:29:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:29:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:43.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:29:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:44 compute-2 sudo[249088]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:29:44 compute-2 sudo[249088]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:29:44 compute-2 sudo[249088]: pam_unix(sudo:session): session closed for user root
Dec 01 10:29:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:29:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:44.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:29:45 compute-2 ceph-mon[76053]: pgmap v1176: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 546 B/s rd, 0 op/s
Dec 01 10:29:45 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:29:45 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:29:45 compute-2 podman[249113]: 2025-12-01 10:29:45.398520908 +0000 UTC m=+0.053529159 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 01 10:29:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:29:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:45.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:29:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:45 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:29:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:46.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:47 compute-2 ceph-mon[76053]: pgmap v1177: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 819 B/s rd, 0 op/s
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.450015) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584987450196, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 2960, "num_deletes": 506, "total_data_size": 6729723, "memory_usage": 6866288, "flush_reason": "Manual Compaction"}
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Dec 01 10:29:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:47.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584987475939, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 4347942, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33672, "largest_seqno": 36627, "table_properties": {"data_size": 4335363, "index_size": 7537, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3973, "raw_key_size": 33338, "raw_average_key_size": 21, "raw_value_size": 4306800, "raw_average_value_size": 2737, "num_data_blocks": 321, "num_entries": 1573, "num_filter_entries": 1573, "num_deletions": 506, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764584788, "oldest_key_time": 1764584788, "file_creation_time": 1764584987, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 25975 microseconds, and 12791 cpu microseconds.
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.476002) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 4347942 bytes OK
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.476021) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.477162) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.477174) EVENT_LOG_v1 {"time_micros": 1764584987477170, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.477193) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 6714881, prev total WAL file size 6714881, number of live WAL files 2.
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.478545) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032373631' seq:72057594037927935, type:22 .. '7061786F730033303133' seq:0, type:0; will stop at (end)
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(4246KB)], [63(13MB)]
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584987478609, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 18969885, "oldest_snapshot_seqno": -1}
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 6814 keys, 16744517 bytes, temperature: kUnknown
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584987591059, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 16744517, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16697145, "index_size": 29212, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17093, "raw_key_size": 175915, "raw_average_key_size": 25, "raw_value_size": 16572913, "raw_average_value_size": 2432, "num_data_blocks": 1172, "num_entries": 6814, "num_filter_entries": 6814, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764584987, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.591334) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 16744517 bytes
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.592804) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 168.6 rd, 148.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.1, 13.9 +0.0 blob) out(16.0 +0.0 blob), read-write-amplify(8.2) write-amplify(3.9) OK, records in: 7845, records dropped: 1031 output_compression: NoCompression
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.592825) EVENT_LOG_v1 {"time_micros": 1764584987592815, "job": 38, "event": "compaction_finished", "compaction_time_micros": 112546, "compaction_time_cpu_micros": 32199, "output_level": 6, "num_output_files": 1, "total_output_size": 16744517, "num_input_records": 7845, "num_output_records": 6814, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584987593964, "job": 38, "event": "table_file_deletion", "file_number": 65}
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584987597076, "job": 38, "event": "table_file_deletion", "file_number": 63}
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.478457) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.597195) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.597200) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.597202) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.597203) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:29:47 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.597205) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:29:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:48 compute-2 podman[249137]: 2025-12-01 10:29:48.402493087 +0000 UTC m=+0.062485840 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.build-date=20251125)
Dec 01 10:29:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:48 compute-2 sudo[249157]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:29:48 compute-2 sudo[249157]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:29:48 compute-2 sudo[249157]: pam_unix(sudo:session): session closed for user root
Dec 01 10:29:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:29:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:48.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:29:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:29:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:49.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:29:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:49 compute-2 ceph-mon[76053]: pgmap v1178: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 546 B/s rd, 0 op/s
Dec 01 10:29:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:50 compute-2 ceph-mon[76053]: pgmap v1179: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:29:50 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:29:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:50.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:29:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:51.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:29:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:52 compute-2 ceph-mon[76053]: pgmap v1180: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:29:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:52.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:29:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:53.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:29:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:54.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:55.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:55 compute-2 ceph-mon[76053]: pgmap v1181: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:29:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:29:55 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:29:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:29:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:57.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:29:57 compute-2 ceph-mon[76053]: pgmap v1182: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:29:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:29:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:57.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:29:57 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 10:29:57 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 2400.2 total, 600.0 interval
                                           Cumulative writes: 10K writes, 41K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.02 MB/s
                                           Cumulative WAL: 10K writes, 2921 syncs, 3.62 writes per sync, written: 0.04 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 2319 writes, 8267 keys, 2319 commit groups, 1.0 writes per commit group, ingest: 9.21 MB, 0.02 MB/s
                                           Interval WAL: 2319 writes, 967 syncs, 2.40 writes per sync, written: 0.01 GB, 0.02 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 01 10:29:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:58 compute-2 nova_compute[230216]: 2025-12-01 10:29:58.641 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:29:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:29:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:59.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:29:59 compute-2 nova_compute[230216]: 2025-12-01 10:29:59.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:29:59 compute-2 ceph-mon[76053]: pgmap v1183: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:29:59 compute-2 podman[249192]: 2025-12-01 10:29:59.444083641 +0000 UTC m=+0.100072956 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 01 10:29:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:29:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:29:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:29:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:59.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:29:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:00 compute-2 nova_compute[230216]: 2025-12-01 10:30:00.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:30:00 compute-2 nova_compute[230216]: 2025-12-01 10:30:00.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 10:30:00 compute-2 nova_compute[230216]: 2025-12-01 10:30:00.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 10:30:00 compute-2 nova_compute[230216]: 2025-12-01 10:30:00.247 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 10:30:00 compute-2 nova_compute[230216]: 2025-12-01 10:30:00.247 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:30:00 compute-2 nova_compute[230216]: 2025-12-01 10:30:00.247 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 01 10:30:00 compute-2 ceph-mon[76053]: Health detail: HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore; 2 failed cephadm daemon(s)
Dec 01 10:30:00 compute-2 ceph-mon[76053]: [WRN] BLUESTORE_SLOW_OP_ALERT: 1 OSD(s) experiencing slow operations in BlueStore
Dec 01 10:30:00 compute-2 ceph-mon[76053]:      osd.2 observed slow operation indications in BlueStore
Dec 01 10:30:00 compute-2 ceph-mon[76053]: [WRN] CEPHADM_FAILED_DAEMON: 2 failed cephadm daemon(s)
Dec 01 10:30:00 compute-2 ceph-mon[76053]:     daemon nfs.cephfs.0.0.compute-1.osfnzc on compute-1 is in error state
Dec 01 10:30:00 compute-2 ceph-mon[76053]:     daemon nfs.cephfs.1.0.compute-2.ymqwfj on compute-2 is in error state
Dec 01 10:30:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.736427) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585000736529, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 379, "num_deletes": 251, "total_data_size": 404383, "memory_usage": 412416, "flush_reason": "Manual Compaction"}
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585000740211, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 247709, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36632, "largest_seqno": 37006, "table_properties": {"data_size": 245482, "index_size": 391, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 6170, "raw_average_key_size": 20, "raw_value_size": 240987, "raw_average_value_size": 797, "num_data_blocks": 17, "num_entries": 302, "num_filter_entries": 302, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764584988, "oldest_key_time": 1764584988, "file_creation_time": 1764585000, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 3809 microseconds, and 1303 cpu microseconds.
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.740253) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 247709 bytes OK
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.740270) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.741853) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.741889) EVENT_LOG_v1 {"time_micros": 1764585000741881, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.741906) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 401863, prev total WAL file size 401863, number of live WAL files 2.
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.742274) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031303034' seq:72057594037927935, type:22 .. '6D6772737461740031323536' seq:0, type:0; will stop at (end)
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(241KB)], [66(15MB)]
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585000742301, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 16992226, "oldest_snapshot_seqno": -1}
Dec 01 10:30:00 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 6606 keys, 12891387 bytes, temperature: kUnknown
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585000816653, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 12891387, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12850168, "index_size": 23571, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16581, "raw_key_size": 171821, "raw_average_key_size": 26, "raw_value_size": 12734334, "raw_average_value_size": 1927, "num_data_blocks": 937, "num_entries": 6606, "num_filter_entries": 6606, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764585000, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.821534) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 12891387 bytes
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.826719) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 228.2 rd, 173.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 16.0 +0.0 blob) out(12.3 +0.0 blob), read-write-amplify(120.6) write-amplify(52.0) OK, records in: 7116, records dropped: 510 output_compression: NoCompression
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.826746) EVENT_LOG_v1 {"time_micros": 1764585000826735, "job": 40, "event": "compaction_finished", "compaction_time_micros": 74457, "compaction_time_cpu_micros": 27160, "output_level": 6, "num_output_files": 1, "total_output_size": 12891387, "num_input_records": 7116, "num_output_records": 6606, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585000827044, "job": 40, "event": "table_file_deletion", "file_number": 68}
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585000830238, "job": 40, "event": "table_file_deletion", "file_number": 66}
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.742225) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.830355) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.830362) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.830364) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.830365) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:30:00 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.830367) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:30:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:01.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:01 compute-2 nova_compute[230216]: 2025-12-01 10:30:01.258 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:30:01 compute-2 ceph-mon[76053]: pgmap v1184: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:30:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:01.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:30:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:03.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:30:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:03 compute-2 ceph-mon[76053]: pgmap v1185: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:30:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:03.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:04 compute-2 nova_compute[230216]: 2025-12-01 10:30:04.225 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:30:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:30:04.718 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:30:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:30:04.718 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:30:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:30:04.719 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:30:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:30:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:05.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:30:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:30:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:05.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:30:05 compute-2 ceph-mon[76053]: pgmap v1186: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:30:05 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:30:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:06 compute-2 nova_compute[230216]: 2025-12-01 10:30:06.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:30:06 compute-2 nova_compute[230216]: 2025-12-01 10:30:06.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:30:06 compute-2 nova_compute[230216]: 2025-12-01 10:30:06.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 10:30:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:06 compute-2 ceph-mon[76053]: pgmap v1187: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:30:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:07.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 01 10:30:07 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2823756396' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:30:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 01 10:30:07 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2823756396' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:30:07 compute-2 nova_compute[230216]: 2025-12-01 10:30:07.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:30:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:07.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/2823756396' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:30:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/2823756396' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:30:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:08 compute-2 nova_compute[230216]: 2025-12-01 10:30:08.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:30:08 compute-2 nova_compute[230216]: 2025-12-01 10:30:08.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:30:08 compute-2 nova_compute[230216]: 2025-12-01 10:30:08.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 01 10:30:08 compute-2 nova_compute[230216]: 2025-12-01 10:30:08.226 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 01 10:30:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:08 compute-2 ceph-mon[76053]: pgmap v1188: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:30:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/4263249999' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:30:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:08 compute-2 sudo[249230]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:30:08 compute-2 sudo[249230]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:30:08 compute-2 sudo[249230]: pam_unix(sudo:session): session closed for user root
Dec 01 10:30:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:30:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:09.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:30:09 compute-2 nova_compute[230216]: 2025-12-01 10:30:09.227 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:30:09 compute-2 nova_compute[230216]: 2025-12-01 10:30:09.259 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:30:09 compute-2 nova_compute[230216]: 2025-12-01 10:30:09.259 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:30:09 compute-2 nova_compute[230216]: 2025-12-01 10:30:09.260 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:30:09 compute-2 nova_compute[230216]: 2025-12-01 10:30:09.260 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 10:30:09 compute-2 nova_compute[230216]: 2025-12-01 10:30:09.260 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:30:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:09.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:09 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:30:09 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:30:09 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3678122586' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:30:09 compute-2 nova_compute[230216]: 2025-12-01 10:30:09.692 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:30:09 compute-2 nova_compute[230216]: 2025-12-01 10:30:09.865 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 10:30:09 compute-2 nova_compute[230216]: 2025-12-01 10:30:09.866 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5196MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 10:30:09 compute-2 nova_compute[230216]: 2025-12-01 10:30:09.867 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:30:09 compute-2 nova_compute[230216]: 2025-12-01 10:30:09.867 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:30:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:10 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1315727704' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:30:10 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3678122586' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:30:10 compute-2 ceph-mon[76053]: pgmap v1189: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:30:10 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:30:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:11.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:11.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:11 compute-2 nova_compute[230216]: 2025-12-01 10:30:11.518 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 10:30:11 compute-2 nova_compute[230216]: 2025-12-01 10:30:11.519 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 10:30:11 compute-2 nova_compute[230216]: 2025-12-01 10:30:11.693 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Refreshing inventories for resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 01 10:30:11 compute-2 nova_compute[230216]: 2025-12-01 10:30:11.794 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Updating ProviderTree inventory for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 01 10:30:11 compute-2 nova_compute[230216]: 2025-12-01 10:30:11.795 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Updating inventory in ProviderTree for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 01 10:30:11 compute-2 nova_compute[230216]: 2025-12-01 10:30:11.816 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Refreshing aggregate associations for resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 01 10:30:11 compute-2 nova_compute[230216]: 2025-12-01 10:30:11.870 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Refreshing trait associations for resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE42,HW_CPU_X86_SSSE3,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AESNI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NODE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_FMA3,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_ABM,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SHA,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_F16C,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 01 10:30:11 compute-2 nova_compute[230216]: 2025-12-01 10:30:11.889 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:30:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:12 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:30:12 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2611264197' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:30:12 compute-2 nova_compute[230216]: 2025-12-01 10:30:12.357 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:30:12 compute-2 nova_compute[230216]: 2025-12-01 10:30:12.363 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 10:30:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:30:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:13.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:30:13 compute-2 ceph-mon[76053]: pgmap v1190: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:30:13 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2611264197' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:30:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:30:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:13.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:30:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:14 compute-2 nova_compute[230216]: 2025-12-01 10:30:14.059 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 10:30:14 compute-2 nova_compute[230216]: 2025-12-01 10:30:14.061 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 10:30:14 compute-2 nova_compute[230216]: 2025-12-01 10:30:14.061 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:30:14 compute-2 nova_compute[230216]: 2025-12-01 10:30:14.062 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:30:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:15.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:15 compute-2 ceph-mon[76053]: pgmap v1191: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:30:15 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1438712320' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:30:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:15.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:15 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:30:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:16 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1105278184' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:30:16 compute-2 podman[249307]: 2025-12-01 10:30:16.421724943 +0000 UTC m=+0.083404727 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 01 10:30:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:30:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:17.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:30:17 compute-2 ceph-mon[76053]: pgmap v1192: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:30:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:17.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:30:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:19.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:30:19 compute-2 ceph-mon[76053]: pgmap v1193: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:30:19 compute-2 podman[249328]: 2025-12-01 10:30:19.41301615 +0000 UTC m=+0.066028707 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Dec 01 10:30:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:19.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:20 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:30:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:30:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:21.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:30:21 compute-2 ceph-mon[76053]: pgmap v1194: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:30:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:21.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:30:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:23.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:30:23 compute-2 ceph-mon[76053]: pgmap v1195: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:30:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:23.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:25.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:25 compute-2 ceph-mon[76053]: pgmap v1196: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:30:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:30:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:25.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:25 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:30:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:26 compute-2 ceph-mon[76053]: pgmap v1197: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:30:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:27.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:27.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:28 compute-2 ceph-mon[76053]: pgmap v1198: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:30:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:29 compute-2 sudo[249358]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:30:29 compute-2 sudo[249358]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:30:29 compute-2 sudo[249358]: pam_unix(sudo:session): session closed for user root
Dec 01 10:30:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.010000239s ======
Dec 01 10:30:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:29.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.010000239s
Dec 01 10:30:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:29.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:30 compute-2 podman[249385]: 2025-12-01 10:30:30.42963657 +0000 UTC m=+0.087155688 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Dec 01 10:30:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:30 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:30:30 compute-2 ceph-mon[76053]: pgmap v1199: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:30:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:31.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:30:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:31.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:30:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:32 compute-2 ceph-mon[76053]: pgmap v1200: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:30:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:30:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:33.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:30:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:33.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:34 compute-2 ceph-mon[76053]: pgmap v1201: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:30:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.002000048s ======
Dec 01 10:30:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:35.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000048s
Dec 01 10:30:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:35.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:35 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:30:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:30:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:37.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:30:37 compute-2 ceph-mon[76053]: pgmap v1202: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:30:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:30:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:37.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:30:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:30:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:39.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:30:39 compute-2 ceph-mon[76053]: pgmap v1203: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:30:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:39.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:30:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:40 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:30:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:41.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:41.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:41 compute-2 ceph-mon[76053]: pgmap v1204: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:30:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:43.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:43 compute-2 ceph-mon[76053]: pgmap v1205: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:30:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:43.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:44 compute-2 sudo[249425]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:30:44 compute-2 sudo[249425]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:30:44 compute-2 sudo[249425]: pam_unix(sudo:session): session closed for user root
Dec 01 10:30:44 compute-2 sudo[249450]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:30:44 compute-2 sudo[249450]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:30:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:45 compute-2 ceph-mon[76053]: pgmap v1206: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:30:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:45.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:45 compute-2 sudo[249450]: pam_unix(sudo:session): session closed for user root
Dec 01 10:30:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:30:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:45.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:30:45 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0.
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.858839) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585045858941, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 677, "num_deletes": 250, "total_data_size": 1202000, "memory_usage": 1219088, "flush_reason": "Manual Compaction"}
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585045865735, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 786181, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 37011, "largest_seqno": 37683, "table_properties": {"data_size": 782939, "index_size": 1150, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 6668, "raw_average_key_size": 16, "raw_value_size": 776411, "raw_average_value_size": 1950, "num_data_blocks": 51, "num_entries": 398, "num_filter_entries": 398, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764585001, "oldest_key_time": 1764585001, "file_creation_time": 1764585045, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 6915 microseconds, and 2728 cpu microseconds.
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.865767) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 786181 bytes OK
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.865780) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.867348) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.867359) EVENT_LOG_v1 {"time_micros": 1764585045867355, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.867376) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 1198331, prev total WAL file size 1198331, number of live WAL files 2.
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.867912) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B7600323531' seq:72057594037927935, type:22 .. '6B7600353032' seq:0, type:0; will stop at (end)
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(767KB)], [69(12MB)]
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585045867945, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 13677568, "oldest_snapshot_seqno": -1}
Dec 01 10:30:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 6493 keys, 12313996 bytes, temperature: kUnknown
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585045938448, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 12313996, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12273695, "index_size": 22975, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16261, "raw_key_size": 171148, "raw_average_key_size": 26, "raw_value_size": 12159669, "raw_average_value_size": 1872, "num_data_blocks": 899, "num_entries": 6493, "num_filter_entries": 6493, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764585045, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.938775) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 12313996 bytes
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.940203) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 193.7 rd, 174.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 12.3 +0.0 blob) out(11.7 +0.0 blob), read-write-amplify(33.1) write-amplify(15.7) OK, records in: 7004, records dropped: 511 output_compression: NoCompression
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.940225) EVENT_LOG_v1 {"time_micros": 1764585045940214, "job": 42, "event": "compaction_finished", "compaction_time_micros": 70616, "compaction_time_cpu_micros": 25581, "output_level": 6, "num_output_files": 1, "total_output_size": 12313996, "num_input_records": 7004, "num_output_records": 6493, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585045940670, "job": 42, "event": "table_file_deletion", "file_number": 71}
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585045943851, "job": 42, "event": "table_file_deletion", "file_number": 69}
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.867788) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.943991) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.943999) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.944001) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.944003) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:30:45 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.944010) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:30:46 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:30:46 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:30:46 compute-2 ceph-mon[76053]: pgmap v1207: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 531 B/s rd, 0 op/s
Dec 01 10:30:46 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:30:46 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:30:46 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:30:46 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:30:46 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:30:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:47.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:47 compute-2 podman[249508]: 2025-12-01 10:30:47.39939658 +0000 UTC m=+0.053137311 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 01 10:30:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:47.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:48 compute-2 ceph-mon[76053]: pgmap v1208: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 531 B/s rd, 0 op/s
Dec 01 10:30:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:49.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:49 compute-2 sudo[249529]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:30:49 compute-2 sudo[249529]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:30:49 compute-2 sudo[249529]: pam_unix(sudo:session): session closed for user root
Dec 01 10:30:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:30:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:49.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:30:49 compute-2 ceph-mon[76053]: pgmap v1209: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 531 B/s rd, 0 op/s
Dec 01 10:30:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:50 compute-2 podman[249556]: 2025-12-01 10:30:50.399263149 +0000 UTC m=+0.060880411 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 01 10:30:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:50 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:30:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:30:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:51.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:30:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:30:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:51.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:30:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:52 compute-2 ceph-mon[76053]: pgmap v1210: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 797 B/s rd, 0 op/s
Dec 01 10:30:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:52 compute-2 sudo[249578]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:30:52 compute-2 sudo[249578]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:30:52 compute-2 sudo[249578]: pam_unix(sudo:session): session closed for user root
Dec 01 10:30:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:53.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:30:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:53.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:30:53 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:30:53 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:30:53 compute-2 ceph-mon[76053]: pgmap v1211: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 531 B/s rd, 0 op/s
Dec 01 10:30:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:54 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:30:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:30:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:55.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:30:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:30:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:55.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:30:55 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:30:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:56 compute-2 ceph-mon[76053]: pgmap v1212: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 531 B/s rd, 0 op/s
Dec 01 10:30:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:57.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:57 compute-2 nova_compute[230216]: 2025-12-01 10:30:57.322 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:30:57 compute-2 nova_compute[230216]: 2025-12-01 10:30:57.323 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:30:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:30:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:57.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:30:57 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Dec 01 10:30:57 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Dec 01 10:30:57 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Dec 01 10:30:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:58 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Dec 01 10:30:58 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Dec 01 10:30:58 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Dec 01 10:30:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:58 compute-2 ceph-mon[76053]: pgmap v1213: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:30:58 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000009 to be held by another RGW process; skipping for now
Dec 01 10:30:58 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000013 to be held by another RGW process; skipping for now
Dec 01 10:30:58 compute-2 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Dec 01 10:30:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:30:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:59.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:30:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:30:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:30:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:30:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:59.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:30:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:00 compute-2 ceph-mon[76053]: pgmap v1214: 353 pgs: 353 active+clean; 41 MiB data, 300 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:31:00 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:31:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:31:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:01.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:31:01 compute-2 nova_compute[230216]: 2025-12-01 10:31:01.406 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:31:01 compute-2 nova_compute[230216]: 2025-12-01 10:31:01.406 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 10:31:01 compute-2 nova_compute[230216]: 2025-12-01 10:31:01.407 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 10:31:01 compute-2 podman[249611]: 2025-12-01 10:31:01.412641876 +0000 UTC m=+0.071342959 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible)
Dec 01 10:31:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:01 compute-2 nova_compute[230216]: 2025-12-01 10:31:01.509 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 10:31:01 compute-2 nova_compute[230216]: 2025-12-01 10:31:01.509 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:31:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:31:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:01.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:31:01 compute-2 ceph-mon[76053]: pgmap v1215: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 53 KiB/s rd, 0 B/s wr, 87 op/s
Dec 01 10:31:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:31:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:03.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:31:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:03.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:04 compute-2 ceph-mon[76053]: pgmap v1216: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 53 KiB/s rd, 0 B/s wr, 87 op/s
Dec 01 10:31:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:31:04.719 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:31:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:31:04.721 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:31:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:31:04.721 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:31:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:05.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:31:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:05.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:31:05 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:31:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:06 compute-2 nova_compute[230216]: 2025-12-01 10:31:06.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:31:06 compute-2 nova_compute[230216]: 2025-12-01 10:31:06.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:31:06 compute-2 nova_compute[230216]: 2025-12-01 10:31:06.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:31:06 compute-2 nova_compute[230216]: 2025-12-01 10:31:06.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 10:31:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:06 compute-2 ceph-mon[76053]: pgmap v1217: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 53 KiB/s rd, 0 B/s wr, 87 op/s
Dec 01 10:31:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:07.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 01 10:31:07 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1814290774' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:31:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 01 10:31:07 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1814290774' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:31:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:07.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/1814290774' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:31:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/1814290774' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:31:08 compute-2 ceph-mon[76053]: pgmap v1218: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 93 KiB/s rd, 0 B/s wr, 155 op/s
Dec 01 10:31:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:31:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:09.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:31:09 compute-2 nova_compute[230216]: 2025-12-01 10:31:09.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:31:09 compute-2 nova_compute[230216]: 2025-12-01 10:31:09.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:31:09 compute-2 nova_compute[230216]: 2025-12-01 10:31:09.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:31:09 compute-2 sudo[249645]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:31:09 compute-2 sudo[249645]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:31:09 compute-2 sudo[249645]: pam_unix(sudo:session): session closed for user root
Dec 01 10:31:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:31:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:09.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:31:09 compute-2 ceph-mon[76053]: pgmap v1219: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 93 KiB/s rd, 0 B/s wr, 154 op/s
Dec 01 10:31:09 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:31:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:10 compute-2 nova_compute[230216]: 2025-12-01 10:31:10.193 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:31:10 compute-2 nova_compute[230216]: 2025-12-01 10:31:10.194 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:31:10 compute-2 nova_compute[230216]: 2025-12-01 10:31:10.194 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:31:10 compute-2 nova_compute[230216]: 2025-12-01 10:31:10.194 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 10:31:10 compute-2 nova_compute[230216]: 2025-12-01 10:31:10.194 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:31:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:10 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:31:10 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3835383908' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:31:10 compute-2 nova_compute[230216]: 2025-12-01 10:31:10.657 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:31:10 compute-2 nova_compute[230216]: 2025-12-01 10:31:10.813 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 10:31:10 compute-2 nova_compute[230216]: 2025-12-01 10:31:10.814 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5156MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 10:31:10 compute-2 nova_compute[230216]: 2025-12-01 10:31:10.814 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:31:10 compute-2 nova_compute[230216]: 2025-12-01 10:31:10.815 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:31:10 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:31:10 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3835383908' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:31:10 compute-2 nova_compute[230216]: 2025-12-01 10:31:10.908 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 10:31:10 compute-2 nova_compute[230216]: 2025-12-01 10:31:10.908 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 10:31:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:10 compute-2 nova_compute[230216]: 2025-12-01 10:31:10.957 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:31:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:11.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:11 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:31:11 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4156725637' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:31:11 compute-2 nova_compute[230216]: 2025-12-01 10:31:11.390 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:31:11 compute-2 nova_compute[230216]: 2025-12-01 10:31:11.397 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 10:31:11 compute-2 nova_compute[230216]: 2025-12-01 10:31:11.419 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 10:31:11 compute-2 nova_compute[230216]: 2025-12-01 10:31:11.421 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 10:31:11 compute-2 nova_compute[230216]: 2025-12-01 10:31:11.421 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.606s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:31:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:11.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:11 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1853699093' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:31:11 compute-2 ceph-mon[76053]: pgmap v1220: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 93 KiB/s rd, 0 B/s wr, 155 op/s
Dec 01 10:31:11 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/4156725637' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:31:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:12 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3108768016' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:31:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:13.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:31:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:13.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:31:13 compute-2 ceph-mon[76053]: pgmap v1221: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 41 KiB/s rd, 0 B/s wr, 67 op/s
Dec 01 10:31:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:14 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3390945076' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:31:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:15.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:15.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:15 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:31:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:15 compute-2 ceph-mon[76053]: pgmap v1222: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 41 KiB/s rd, 0 B/s wr, 67 op/s
Dec 01 10:31:15 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1827298230' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:31:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:17.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:17.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:18 compute-2 podman[249724]: 2025-12-01 10:31:18.390422303 +0000 UTC m=+0.050045514 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 01 10:31:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:18 compute-2 ceph-mon[76053]: pgmap v1223: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 41 KiB/s rd, 0 B/s wr, 68 op/s
Dec 01 10:31:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:19.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:19 compute-2 ceph-mon[76053]: pgmap v1224: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:31:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:31:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:19.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:31:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:20 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:31:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:21.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:21 compute-2 podman[249745]: 2025-12-01 10:31:21.483645571 +0000 UTC m=+0.054848059 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd)
Dec 01 10:31:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:21.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:22 compute-2 ceph-mon[76053]: pgmap v1225: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:31:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:23.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:23.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:25.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:31:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:25.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:31:25 compute-2 ceph-mon[76053]: pgmap v1226: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:31:25 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:31:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:26 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:31:26 compute-2 ceph-mon[76053]: pgmap v1227: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 734 B/s rd, 0 op/s
Dec 01 10:31:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:27.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:27.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:28 compute-2 ceph-mon[76053]: pgmap v1228: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 734 B/s rd, 0 op/s
Dec 01 10:31:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:29.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:29 compute-2 sudo[249775]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:31:29 compute-2 sudo[249775]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:31:29 compute-2 sudo[249775]: pam_unix(sudo:session): session closed for user root
Dec 01 10:31:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:29.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:30 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:31:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:31:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:31.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:31:31 compute-2 ceph-mon[76053]: pgmap v1229: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 489 B/s rd, 0 op/s
Dec 01 10:31:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:31.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:32 compute-2 podman[249806]: 2025-12-01 10:31:32.423365925 +0000 UTC m=+0.080524029 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 01 10:31:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:32 compute-2 sshd-session[249802]: Invalid user alma from 45.78.219.119 port 40192
Dec 01 10:31:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:33.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:33 compute-2 ceph-mon[76053]: pgmap v1230: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 734 B/s rd, 0 op/s
Dec 01 10:31:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:33.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:34 compute-2 sshd-session[249802]: Received disconnect from 45.78.219.119 port 40192:11: Bye Bye [preauth]
Dec 01 10:31:34 compute-2 sshd-session[249802]: Disconnected from invalid user alma 45.78.219.119 port 40192 [preauth]
Dec 01 10:31:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:35.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:35 compute-2 ceph-mon[76053]: pgmap v1231: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 489 B/s rd, 0 op/s
Dec 01 10:31:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:35.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:35 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:31:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:36 compute-2 ceph-mon[76053]: pgmap v1232: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 734 B/s rd, 0 op/s
Dec 01 10:31:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:37.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:37.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:39 compute-2 ceph-mon[76053]: pgmap v1233: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:31:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:39.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:39.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:31:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:40 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:31:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:41.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:41.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:41 compute-2 ceph-mon[76053]: pgmap v1234: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:31:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:43 compute-2 ceph-mon[76053]: pgmap v1235: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:31:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:31:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:43.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:31:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:31:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:43.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:31:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:45 compute-2 ceph-mon[76053]: pgmap v1236: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:31:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:45.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:31:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:45.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:31:45 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:31:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:47.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:47 compute-2 ceph-mon[76053]: pgmap v1237: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:31:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:47.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:31:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:49.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:31:49 compute-2 ceph-mon[76053]: pgmap v1238: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:31:49 compute-2 sudo[249850]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:31:49 compute-2 sudo[249850]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:31:49 compute-2 sudo[249850]: pam_unix(sudo:session): session closed for user root
Dec 01 10:31:49 compute-2 podman[249849]: 2025-12-01 10:31:49.405752944 +0000 UTC m=+0.059192785 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 01 10:31:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:49.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:50 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:31:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:51.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:51 compute-2 ceph-mon[76053]: pgmap v1239: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:31:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:51.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:52 compute-2 podman[249898]: 2025-12-01 10:31:52.393536632 +0000 UTC m=+0.053605038 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 01 10:31:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:52 compute-2 sudo[249918]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:31:52 compute-2 sudo[249918]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:31:53 compute-2 sudo[249918]: pam_unix(sudo:session): session closed for user root
Dec 01 10:31:53 compute-2 sudo[249943]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:31:53 compute-2 sudo[249943]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:31:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:53.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:53 compute-2 ceph-mon[76053]: pgmap v1240: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:31:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:53 compute-2 sudo[249943]: pam_unix(sudo:session): session closed for user root
Dec 01 10:31:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:31:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:53.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:31:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:54 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:31:54 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:31:54 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:31:54 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:31:54 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:31:54 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:31:54 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:31:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:55.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:55.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:55 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:31:55 compute-2 ceph-mon[76053]: pgmap v1241: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 513 B/s rd, 0 op/s
Dec 01 10:31:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:31:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:56 compute-2 ceph-mon[76053]: pgmap v1242: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 770 B/s rd, 0 op/s
Dec 01 10:31:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:31:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:57.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:31:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:31:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:57.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:31:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:58 compute-2 ceph-mon[76053]: pgmap v1243: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 513 B/s rd, 0 op/s
Dec 01 10:31:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:59.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:31:59 compute-2 sudo[250007]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:31:59 compute-2 sudo[250007]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:31:59 compute-2 sudo[250007]: pam_unix(sudo:session): session closed for user root
Dec 01 10:31:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:31:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:31:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:59.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:31:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:00 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:32:00 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:32:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:00 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:32:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:01.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:01 compute-2 ceph-mon[76053]: pgmap v1244: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 513 B/s rd, 0 op/s
Dec 01 10:32:01 compute-2 nova_compute[230216]: 2025-12-01 10:32:01.421 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:32:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:01.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:02 compute-2 nova_compute[230216]: 2025-12-01 10:32:02.759 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:32:02 compute-2 nova_compute[230216]: 2025-12-01 10:32:02.760 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:32:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:03.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:03 compute-2 nova_compute[230216]: 2025-12-01 10:32:03.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:32:03 compute-2 nova_compute[230216]: 2025-12-01 10:32:03.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 10:32:03 compute-2 nova_compute[230216]: 2025-12-01 10:32:03.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 10:32:03 compute-2 nova_compute[230216]: 2025-12-01 10:32:03.268 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 10:32:03 compute-2 ceph-mon[76053]: pgmap v1245: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 770 B/s rd, 0 op/s
Dec 01 10:32:03 compute-2 podman[250034]: 2025-12-01 10:32:03.434506514 +0000 UTC m=+0.091348895 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Dec 01 10:32:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:32:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:03.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:32:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:32:04.720 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:32:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:32:04.721 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:32:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:32:04.721 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:32:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:05.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:05 compute-2 ceph-mon[76053]: pgmap v1246: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 513 B/s rd, 0 op/s
Dec 01 10:32:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:05.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:05 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:32:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:06 compute-2 nova_compute[230216]: 2025-12-01 10:32:06.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:32:06 compute-2 nova_compute[230216]: 2025-12-01 10:32:06.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:32:06 compute-2 nova_compute[230216]: 2025-12-01 10:32:06.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:32:06 compute-2 nova_compute[230216]: 2025-12-01 10:32:06.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 10:32:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:07.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:07 compute-2 ceph-mon[76053]: pgmap v1247: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:32:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/137364324' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:32:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/137364324' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:32:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:07.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:32:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:09.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:32:09 compute-2 nova_compute[230216]: 2025-12-01 10:32:09.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:32:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:09 compute-2 sudo[250070]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:32:09 compute-2 sudo[250070]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:32:09 compute-2 sudo[250070]: pam_unix(sudo:session): session closed for user root
Dec 01 10:32:09 compute-2 ceph-mon[76053]: pgmap v1248: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:32:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:09.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:32:10 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:32:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:11.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:11 compute-2 nova_compute[230216]: 2025-12-01 10:32:11.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:32:11 compute-2 nova_compute[230216]: 2025-12-01 10:32:11.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:32:11 compute-2 nova_compute[230216]: 2025-12-01 10:32:11.289 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:32:11 compute-2 nova_compute[230216]: 2025-12-01 10:32:11.289 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:32:11 compute-2 nova_compute[230216]: 2025-12-01 10:32:11.290 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:32:11 compute-2 nova_compute[230216]: 2025-12-01 10:32:11.290 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 10:32:11 compute-2 nova_compute[230216]: 2025-12-01 10:32:11.290 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:32:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:11 compute-2 ceph-mon[76053]: pgmap v1249: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:32:11 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/528854548' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:32:11 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1989639442' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:32:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:11.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:11 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:32:11 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1881486695' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:32:11 compute-2 nova_compute[230216]: 2025-12-01 10:32:11.772 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:32:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:11 compute-2 nova_compute[230216]: 2025-12-01 10:32:11.942 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 10:32:11 compute-2 nova_compute[230216]: 2025-12-01 10:32:11.943 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5172MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 10:32:11 compute-2 nova_compute[230216]: 2025-12-01 10:32:11.943 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:32:11 compute-2 nova_compute[230216]: 2025-12-01 10:32:11.943 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:32:12 compute-2 nova_compute[230216]: 2025-12-01 10:32:12.037 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 10:32:12 compute-2 nova_compute[230216]: 2025-12-01 10:32:12.037 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 10:32:12 compute-2 nova_compute[230216]: 2025-12-01 10:32:12.064 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:32:12 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:32:12 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/679368349' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:32:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:12 compute-2 nova_compute[230216]: 2025-12-01 10:32:12.499 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:32:12 compute-2 nova_compute[230216]: 2025-12-01 10:32:12.505 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 10:32:12 compute-2 nova_compute[230216]: 2025-12-01 10:32:12.528 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 10:32:12 compute-2 nova_compute[230216]: 2025-12-01 10:32:12.530 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 10:32:12 compute-2 nova_compute[230216]: 2025-12-01 10:32:12.531 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.588s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:32:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:13 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1881486695' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:32:13 compute-2 ceph-mon[76053]: pgmap v1250: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:32:13 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/679368349' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:32:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:13.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:13.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:15 compute-2 ceph-mon[76053]: pgmap v1251: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:32:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:15.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:15.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:15 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:32:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:16 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/568064490' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:32:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:17 compute-2 ceph-mon[76053]: pgmap v1252: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:32:17 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2213763123' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:32:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:32:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:17.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:32:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:32:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:17.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:32:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:19 compute-2 ceph-mon[76053]: pgmap v1253: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:32:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:19.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:19.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:20 compute-2 ceph-mon[76053]: pgmap v1254: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:32:20 compute-2 podman[250149]: 2025-12-01 10:32:20.62268546 +0000 UTC m=+0.049651660 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 01 10:32:20 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:32:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:32:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:21.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:32:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:21.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:22 compute-2 ceph-mon[76053]: pgmap v1255: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:32:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:23.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:23 compute-2 podman[250169]: 2025-12-01 10:32:23.391001896 +0000 UTC m=+0.054641753 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 01 10:32:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:23.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:25 compute-2 ceph-mon[76053]: pgmap v1256: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:32:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:32:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:32:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:25.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:32:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:25.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:25 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:32:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:32:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:27.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:32:27 compute-2 ceph-mon[76053]: pgmap v1257: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:32:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:32:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:27.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:32:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:29.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:29 compute-2 ceph-mon[76053]: pgmap v1258: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:32:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:29 compute-2 sudo[250198]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:32:29 compute-2 sudo[250198]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:32:29 compute-2 sudo[250198]: pam_unix(sudo:session): session closed for user root
Dec 01 10:32:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:32:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:29.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:32:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:30 compute-2 ceph-mon[76053]: pgmap v1259: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:32:30 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:32:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:32:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:31.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:32:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:31.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:33.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:33 compute-2 ceph-mon[76053]: pgmap v1260: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:32:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:33.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:34 compute-2 podman[250227]: 2025-12-01 10:32:34.422810983 +0000 UTC m=+0.076506929 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 01 10:32:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:35.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:35 compute-2 ceph-mon[76053]: pgmap v1261: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:32:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:32:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:35.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:32:35 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:32:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:36 compute-2 ceph-mon[76053]: pgmap v1262: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:32:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:37.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:32:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:37.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:32:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:39 compute-2 ceph-mon[76053]: pgmap v1263: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:32:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:39.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:39.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:32:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:40 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:32:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:41 compute-2 ceph-mon[76053]: pgmap v1264: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:32:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:41.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:41.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:43.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:43.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:43 compute-2 ceph-mon[76053]: pgmap v1265: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:32:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:44 compute-2 ceph-mon[76053]: pgmap v1266: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:32:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:45.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:45.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:45 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:32:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:46 compute-2 nova_compute[230216]: 2025-12-01 10:32:46.348 230220 DEBUG oslo_concurrency.processutils [None req-64cac02e-2179-4e9c-a452-97dadcc3883d 8f40188af6da43f2a935c6c0b2de642b 9a5734898a6345909986f17ddf57b27d - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:32:46 compute-2 nova_compute[230216]: 2025-12-01 10:32:46.390 230220 DEBUG oslo_concurrency.processutils [None req-64cac02e-2179-4e9c-a452-97dadcc3883d 8f40188af6da43f2a935c6c0b2de642b 9a5734898a6345909986f17ddf57b27d - - default default] CMD "env LANG=C uptime" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:32:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:46 compute-2 ceph-mon[76053]: pgmap v1267: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:32:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:47.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:32:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:47.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:32:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:48 compute-2 ceph-mon[76053]: pgmap v1268: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:32:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:49.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:49 compute-2 sudo[250270]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:32:49 compute-2 sudo[250270]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:32:49 compute-2 sudo[250270]: pam_unix(sudo:session): session closed for user root
Dec 01 10:32:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:32:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:49.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:32:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:50 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:32:50.521 141949 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 01 10:32:50 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:32:50.522 141949 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 01 10:32:50 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:32:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:51 compute-2 ceph-mon[76053]: pgmap v1269: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:32:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:51.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:51 compute-2 podman[250295]: 2025-12-01 10:32:51.420382663 +0000 UTC m=+0.075955676 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 01 10:32:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:51 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:32:51.524 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 01 10:32:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:51.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:53 compute-2 ceph-mon[76053]: pgmap v1270: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:32:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:53.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:32:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:53.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:32:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:54 compute-2 podman[250318]: 2025-12-01 10:32:54.393409699 +0000 UTC m=+0.056293535 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS)
Dec 01 10:32:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:55.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:55.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:56 compute-2 ceph-mon[76053]: pgmap v1271: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:32:56 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:32:56 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:32:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:57 compute-2 ceph-mon[76053]: pgmap v1272: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:32:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:57.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:57.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:59.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:59 compute-2 ceph-mon[76053]: pgmap v1273: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:32:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:32:59 compute-2 sudo[250344]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:32:59 compute-2 sudo[250344]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:32:59 compute-2 sudo[250344]: pam_unix(sudo:session): session closed for user root
Dec 01 10:32:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:32:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:32:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:59.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:32:59 compute-2 sudo[250369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:32:59 compute-2 sudo[250369]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:32:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:00 compute-2 sudo[250369]: pam_unix(sudo:session): session closed for user root
Dec 01 10:33:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:01.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:01 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:33:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:01 compute-2 ceph-mon[76053]: pgmap v1274: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:33:01 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:33:01 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:33:01 compute-2 ceph-mon[76053]: pgmap v1275: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 578 B/s rd, 0 op/s
Dec 01 10:33:01 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:33:01 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:33:01 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:33:01 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:33:01 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:33:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:01.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:02 compute-2 nova_compute[230216]: 2025-12-01 10:33:02.533 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:33:02 compute-2 ceph-mon[76053]: pgmap v1276: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 578 B/s rd, 0 op/s
Dec 01 10:33:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:03 compute-2 nova_compute[230216]: 2025-12-01 10:33:03.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:33:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:03.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:03.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:04 compute-2 ceph-mon[76053]: pgmap v1277: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 578 B/s rd, 0 op/s
Dec 01 10:33:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:33:04.722 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:33:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:33:04.723 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:33:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:33:04.723 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:33:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:05 compute-2 nova_compute[230216]: 2025-12-01 10:33:05.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:33:05 compute-2 nova_compute[230216]: 2025-12-01 10:33:05.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 10:33:05 compute-2 nova_compute[230216]: 2025-12-01 10:33:05.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 10:33:05 compute-2 nova_compute[230216]: 2025-12-01 10:33:05.238 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 10:33:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:05.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:05 compute-2 podman[250431]: 2025-12-01 10:33:05.432416553 +0000 UTC m=+0.085994014 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 01 10:33:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:05.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:06 compute-2 nova_compute[230216]: 2025-12-01 10:33:06.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:33:06 compute-2 nova_compute[230216]: 2025-12-01 10:33:06.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:33:06 compute-2 nova_compute[230216]: 2025-12-01 10:33:06.209 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:33:06 compute-2 nova_compute[230216]: 2025-12-01 10:33:06.209 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 10:33:06 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:33:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:07.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:07.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:07 compute-2 ceph-mon[76053]: pgmap v1278: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 578 B/s rd, 0 op/s
Dec 01 10:33:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/457096563' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:33:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/457096563' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:33:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:08 compute-2 sudo[250462]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:33:08 compute-2 sudo[250462]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:33:08 compute-2 sudo[250462]: pam_unix(sudo:session): session closed for user root
Dec 01 10:33:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:33:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:09.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:33:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:33:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:09.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:33:09 compute-2 sudo[250489]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:33:09 compute-2 sudo[250489]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:33:09 compute-2 sudo[250489]: pam_unix(sudo:session): session closed for user root
Dec 01 10:33:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:10 compute-2 nova_compute[230216]: 2025-12-01 10:33:10.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:33:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:11 compute-2 nova_compute[230216]: 2025-12-01 10:33:11.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:33:11 compute-2 nova_compute[230216]: 2025-12-01 10:33:11.228 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:33:11 compute-2 nova_compute[230216]: 2025-12-01 10:33:11.228 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:33:11 compute-2 nova_compute[230216]: 2025-12-01 10:33:11.228 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:33:11 compute-2 nova_compute[230216]: 2025-12-01 10:33:11.228 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 10:33:11 compute-2 nova_compute[230216]: 2025-12-01 10:33:11.229 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:33:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:11.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:11 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:33:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:11 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:33:11 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2826343472' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:33:11 compute-2 nova_compute[230216]: 2025-12-01 10:33:11.696 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:33:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:11.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:11 compute-2 nova_compute[230216]: 2025-12-01 10:33:11.864 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 10:33:11 compute-2 nova_compute[230216]: 2025-12-01 10:33:11.865 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5147MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 10:33:11 compute-2 nova_compute[230216]: 2025-12-01 10:33:11.866 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:33:11 compute-2 nova_compute[230216]: 2025-12-01 10:33:11.866 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:33:11 compute-2 nova_compute[230216]: 2025-12-01 10:33:11.934 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 10:33:11 compute-2 nova_compute[230216]: 2025-12-01 10:33:11.934 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 10:33:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:11 compute-2 nova_compute[230216]: 2025-12-01 10:33:11.960 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:33:12 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:33:12 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:33:12 compute-2 ceph-mon[76053]: pgmap v1279: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 578 B/s rd, 0 op/s
Dec 01 10:33:12 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:33:12 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:33:12 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1804216692' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:33:12 compute-2 nova_compute[230216]: 2025-12-01 10:33:12.464 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.504s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:33:12 compute-2 nova_compute[230216]: 2025-12-01 10:33:12.469 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 10:33:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:12 compute-2 nova_compute[230216]: 2025-12-01 10:33:12.500 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 10:33:12 compute-2 nova_compute[230216]: 2025-12-01 10:33:12.502 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 10:33:12 compute-2 nova_compute[230216]: 2025-12-01 10:33:12.502 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.636s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:33:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:13 compute-2 ceph-mon[76053]: pgmap v1280: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 578 B/s rd, 0 op/s
Dec 01 10:33:13 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/530923376' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:33:13 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2826343472' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:33:13 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/113769782' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:33:13 compute-2 ceph-mon[76053]: pgmap v1281: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:33:13 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1804216692' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:33:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:33:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:13.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:33:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:13.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:14 compute-2 nova_compute[230216]: 2025-12-01 10:33:14.503 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:33:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:15.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:33:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:15.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:33:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:16 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:33:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:17.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:17 compute-2 ceph-mon[76053]: pgmap v1282: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:33:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:17.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:18 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3859021103' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:33:18 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3811083457' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:33:18 compute-2 ceph-mon[76053]: pgmap v1283: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:33:18 compute-2 ceph-mon[76053]: pgmap v1284: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:33:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:19.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:33:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:19.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:33:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:20 compute-2 ceph-mon[76053]: pgmap v1285: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:33:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:33:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:21.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:33:21 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:33:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:33:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:21.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:33:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:22 compute-2 podman[250570]: 2025-12-01 10:33:22.384105057 +0000 UTC m=+0.045304033 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 01 10:33:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:23.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:23 compute-2 ceph-mon[76053]: pgmap v1286: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:33:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:23.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:24 compute-2 ceph-mon[76053]: pgmap v1287: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:33:24 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:33:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:25.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:25 compute-2 podman[250591]: 2025-12-01 10:33:25.394667144 +0000 UTC m=+0.052575852 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Dec 01 10:33:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:25.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:26 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:33:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:27.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:27 compute-2 ceph-mon[76053]: pgmap v1288: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:33:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:27.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:29.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:29 compute-2 ceph-mon[76053]: pgmap v1289: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:33:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:29.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:29 compute-2 sudo[250618]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:33:29 compute-2 sudo[250618]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:33:29 compute-2 sudo[250618]: pam_unix(sudo:session): session closed for user root
Dec 01 10:33:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0.
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.635953) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585210635984, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 1808, "num_deletes": 251, "total_data_size": 4859713, "memory_usage": 4930928, "flush_reason": "Manual Compaction"}
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started
Dec 01 10:33:30 compute-2 ceph-mon[76053]: pgmap v1290: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585210726471, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 3141933, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 37688, "largest_seqno": 39491, "table_properties": {"data_size": 3134277, "index_size": 4599, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1989, "raw_key_size": 15807, "raw_average_key_size": 20, "raw_value_size": 3119073, "raw_average_value_size": 3993, "num_data_blocks": 200, "num_entries": 781, "num_filter_entries": 781, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764585046, "oldest_key_time": 1764585046, "file_creation_time": 1764585210, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 90572 microseconds, and 6840 cpu microseconds.
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.726523) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 3141933 bytes OK
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.726543) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.750378) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.750451) EVENT_LOG_v1 {"time_micros": 1764585210750440, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.750485) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 4851522, prev total WAL file size 4851522, number of live WAL files 2.
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.751927) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033303132' seq:72057594037927935, type:22 .. '7061786F730033323634' seq:0, type:0; will stop at (end)
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(3068KB)], [72(11MB)]
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585210752005, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 15455929, "oldest_snapshot_seqno": -1}
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 6758 keys, 13342787 bytes, temperature: kUnknown
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585210952102, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 13342787, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13299888, "index_size": 24931, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16901, "raw_key_size": 177443, "raw_average_key_size": 26, "raw_value_size": 13180383, "raw_average_value_size": 1950, "num_data_blocks": 980, "num_entries": 6758, "num_filter_entries": 6758, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764585210, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.952388) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 13342787 bytes
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.957125) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 77.2 rd, 66.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.0, 11.7 +0.0 blob) out(12.7 +0.0 blob), read-write-amplify(9.2) write-amplify(4.2) OK, records in: 7274, records dropped: 516 output_compression: NoCompression
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.957176) EVENT_LOG_v1 {"time_micros": 1764585210957160, "job": 44, "event": "compaction_finished", "compaction_time_micros": 200196, "compaction_time_cpu_micros": 26941, "output_level": 6, "num_output_files": 1, "total_output_size": 13342787, "num_input_records": 7274, "num_output_records": 6758, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 10:33:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585210958226, "job": 44, "event": "table_file_deletion", "file_number": 74}
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585210960376, "job": 44, "event": "table_file_deletion", "file_number": 72}
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.751761) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.960485) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.960490) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.960492) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.960494) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:33:30 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.960496) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:33:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:33:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:31.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:33:31 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:33:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:31.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:33.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:33 compute-2 ceph-mon[76053]: pgmap v1291: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:33:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:33:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:33.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:33:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:34 compute-2 ceph-mon[76053]: pgmap v1292: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:33:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:33:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:35.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:33:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:35.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:36 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:33:36 compute-2 podman[250649]: 2025-12-01 10:33:36.418989173 +0000 UTC m=+0.079964210 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 01 10:33:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:37.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:37 compute-2 ceph-mon[76053]: pgmap v1293: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:33:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:37.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:38 compute-2 ceph-mon[76053]: pgmap v1294: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:33:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:33:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:39.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:33:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:39 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:33:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:33:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:39.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:33:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:40 compute-2 ceph-mon[76053]: pgmap v1295: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:33:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:41.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:41 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:33:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:33:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:41.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:33:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:42 compute-2 ceph-mon[76053]: pgmap v1296: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:33:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:43.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:33:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:43.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:33:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:33:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:45.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:33:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:45 compute-2 ceph-mon[76053]: pgmap v1297: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:33:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:45.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:46 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:33:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:47.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:47 compute-2 ceph-mon[76053]: pgmap v1298: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:33:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:33:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:47.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:33:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:48 compute-2 ceph-mon[76053]: pgmap v1299: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:33:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:33:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:49.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:33:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:49.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:50 compute-2 sudo[250689]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:33:50 compute-2 sudo[250689]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:33:50 compute-2 sudo[250689]: pam_unix(sudo:session): session closed for user root
Dec 01 10:33:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:50 compute-2 ceph-mon[76053]: pgmap v1300: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:33:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:51.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:51 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:33:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:33:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:51.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:33:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:53.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:53 compute-2 podman[250716]: 2025-12-01 10:33:53.396346631 +0000 UTC m=+0.049569885 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 01 10:33:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:53 compute-2 ceph-mon[76053]: pgmap v1301: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:33:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:33:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:53.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:33:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:33:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:55.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:33:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:55 compute-2 ceph-mon[76053]: pgmap v1302: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:33:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:33:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:55.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:56 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:33:56 compute-2 podman[250739]: 2025-12-01 10:33:56.398669962 +0000 UTC m=+0.062498603 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, tcib_managed=true, container_name=multipathd)
Dec 01 10:33:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:56 compute-2 ceph-mon[76053]: pgmap v1303: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:33:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:57.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:33:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:57.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:33:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:33:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:59.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:33:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:33:59 compute-2 ceph-mon[76053]: pgmap v1304: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:33:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:33:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:33:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:59.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:33:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:34:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:01.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:34:01 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:34:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:01 compute-2 ceph-mon[76053]: pgmap v1305: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:34:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:01.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:03 compute-2 nova_compute[230216]: 2025-12-01 10:34:03.201 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:34:03 compute-2 nova_compute[230216]: 2025-12-01 10:34:03.316 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:34:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:03.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:03 compute-2 ceph-mon[76053]: pgmap v1306: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:34:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:34:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:03.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:34:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:04 compute-2 nova_compute[230216]: 2025-12-01 10:34:04.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:34:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:04 compute-2 ceph-mon[76053]: pgmap v1307: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:34:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:34:04.724 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:34:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:34:04.724 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:34:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:34:04.724 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:34:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:05 compute-2 nova_compute[230216]: 2025-12-01 10:34:05.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:34:05 compute-2 nova_compute[230216]: 2025-12-01 10:34:05.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 10:34:05 compute-2 nova_compute[230216]: 2025-12-01 10:34:05.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 10:34:05 compute-2 nova_compute[230216]: 2025-12-01 10:34:05.225 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 10:34:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.002000048s ======
Dec 01 10:34:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:05.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000048s
Dec 01 10:34:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:05.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:06 compute-2 nova_compute[230216]: 2025-12-01 10:34:06.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:34:06 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:34:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 01 10:34:07 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2838314110' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:34:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 01 10:34:07 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2838314110' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:34:07 compute-2 nova_compute[230216]: 2025-12-01 10:34:07.201 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:34:07 compute-2 nova_compute[230216]: 2025-12-01 10:34:07.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:34:07 compute-2 nova_compute[230216]: 2025-12-01 10:34:07.206 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 10:34:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:34:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:07.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:34:07 compute-2 podman[250769]: 2025-12-01 10:34:07.421531425 +0000 UTC m=+0.079548221 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 01 10:34:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:07 compute-2 ceph-mon[76053]: pgmap v1308: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:34:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/2838314110' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:34:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/2838314110' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:34:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:07.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:08 compute-2 sudo[250797]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:34:08 compute-2 sudo[250797]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:34:08 compute-2 sudo[250797]: pam_unix(sudo:session): session closed for user root
Dec 01 10:34:08 compute-2 sudo[250822]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Dec 01 10:34:08 compute-2 sudo[250822]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:34:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:09 compute-2 podman[250919]: 2025-12-01 10:34:09.286981452 +0000 UTC m=+0.055818509 container exec 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.build-date=20250325)
Dec 01 10:34:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:34:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:09.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:34:09 compute-2 podman[250919]: 2025-12-01 10:34:09.425008354 +0000 UTC m=+0.193845391 container exec_died 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec 01 10:34:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:09 compute-2 ceph-mon[76053]: pgmap v1309: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:34:09 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec 01 10:34:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:34:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:09.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:34:09 compute-2 podman[251041]: 2025-12-01 10:34:09.911677232 +0000 UTC m=+0.056853265 container exec f15aa877aa74ea87a56ed7c269b1149a449e9df36b30b3012f6ca84c92ef9519 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 01 10:34:09 compute-2 podman[251041]: 2025-12-01 10:34:09.918498529 +0000 UTC m=+0.063674542 container exec_died f15aa877aa74ea87a56ed7c269b1149a449e9df36b30b3012f6ca84c92ef9519 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 01 10:34:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:10 compute-2 sudo[251118]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:34:10 compute-2 sudo[251118]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:34:10 compute-2 sudo[251118]: pam_unix(sudo:session): session closed for user root
Dec 01 10:34:10 compute-2 podman[251203]: 2025-12-01 10:34:10.384010007 +0000 UTC m=+0.048219702 container exec 5c83b48e4050b43a9798275fb3960cbbe72d63867925b25374b9306e495d3a3c (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt)
Dec 01 10:34:10 compute-2 podman[251203]: 2025-12-01 10:34:10.394916234 +0000 UTC m=+0.059125909 container exec_died 5c83b48e4050b43a9798275fb3960cbbe72d63867925b25374b9306e495d3a3c (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt)
Dec 01 10:34:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:34:10 compute-2 podman[251270]: 2025-12-01 10:34:10.579450217 +0000 UTC m=+0.048083510 container exec a0ef4b89ca18a14932dc44e4b2a521390b6fd555df2aaa06cada89bf83a23464 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, description=keepalived for Ceph, vcs-type=git, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, distribution-scope=public, name=keepalived, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1793, vendor=Red Hat, Inc., version=2.2.4)
Dec 01 10:34:10 compute-2 podman[251270]: 2025-12-01 10:34:10.592962887 +0000 UTC m=+0.061596150 container exec_died a0ef4b89ca18a14932dc44e4b2a521390b6fd555df2aaa06cada89bf83a23464 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, description=keepalived for Ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, architecture=x86_64, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Dec 01 10:34:10 compute-2 sudo[250822]: pam_unix(sudo:session): session closed for user root
Dec 01 10:34:10 compute-2 sudo[251339]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:34:10 compute-2 sudo[251339]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:34:10 compute-2 sudo[251339]: pam_unix(sudo:session): session closed for user root
Dec 01 10:34:10 compute-2 sudo[251364]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:34:10 compute-2 sudo[251364]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:34:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:11 compute-2 nova_compute[230216]: 2025-12-01 10:34:11.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:34:11 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:34:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:11.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:11 compute-2 sudo[251364]: pam_unix(sudo:session): session closed for user root
Dec 01 10:34:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:11 compute-2 ceph-mon[76053]: pgmap v1310: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:34:11 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:34:11 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:34:11 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:34:11 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:34:11 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Dec 01 10:34:11 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Dec 01 10:34:11 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:34:11 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:34:11 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:34:11 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:34:11 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:34:11 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:34:11 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:34:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:34:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:11.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:34:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:12 compute-2 ceph-mon[76053]: pgmap v1311: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 837 B/s rd, 0 op/s
Dec 01 10:34:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:13 compute-2 nova_compute[230216]: 2025-12-01 10:34:13.213 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:34:13 compute-2 nova_compute[230216]: 2025-12-01 10:34:13.214 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:34:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:13.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:13 compute-2 nova_compute[230216]: 2025-12-01 10:34:13.544 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:34:13 compute-2 nova_compute[230216]: 2025-12-01 10:34:13.544 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:34:13 compute-2 nova_compute[230216]: 2025-12-01 10:34:13.545 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:34:13 compute-2 nova_compute[230216]: 2025-12-01 10:34:13.545 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 10:34:13 compute-2 nova_compute[230216]: 2025-12-01 10:34:13.545 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:34:13 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/4228211347' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:34:13 compute-2 ceph-mon[76053]: pgmap v1312: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 558 B/s rd, 0 op/s
Dec 01 10:34:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:34:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:13.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:34:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:13 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:34:13 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2262681742' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:34:13 compute-2 nova_compute[230216]: 2025-12-01 10:34:13.993 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:34:14 compute-2 nova_compute[230216]: 2025-12-01 10:34:14.139 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 10:34:14 compute-2 nova_compute[230216]: 2025-12-01 10:34:14.140 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5157MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 10:34:14 compute-2 nova_compute[230216]: 2025-12-01 10:34:14.140 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:34:14 compute-2 nova_compute[230216]: 2025-12-01 10:34:14.141 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:34:14 compute-2 nova_compute[230216]: 2025-12-01 10:34:14.206 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 10:34:14 compute-2 nova_compute[230216]: 2025-12-01 10:34:14.206 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 10:34:14 compute-2 nova_compute[230216]: 2025-12-01 10:34:14.226 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:34:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:14 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:34:14 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/565985170' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:34:14 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2262681742' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:34:14 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1574029017' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:34:14 compute-2 nova_compute[230216]: 2025-12-01 10:34:14.657 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:34:14 compute-2 nova_compute[230216]: 2025-12-01 10:34:14.663 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 10:34:14 compute-2 nova_compute[230216]: 2025-12-01 10:34:14.681 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 10:34:14 compute-2 nova_compute[230216]: 2025-12-01 10:34:14.682 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 10:34:14 compute-2 nova_compute[230216]: 2025-12-01 10:34:14.683 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.542s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:34:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:15.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:15 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/565985170' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:34:15 compute-2 ceph-mon[76053]: pgmap v1313: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 558 B/s rd, 0 op/s
Dec 01 10:34:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:15.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:16 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:34:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:16 compute-2 sudo[251472]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:34:16 compute-2 sudo[251472]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:34:16 compute-2 sudo[251472]: pam_unix(sudo:session): session closed for user root
Dec 01 10:34:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:17.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:17 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:34:17 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:34:17 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/938820539' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:34:17 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/784333000' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:34:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:17.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:18 compute-2 ceph-mon[76053]: pgmap v1314: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 558 B/s rd, 0 op/s
Dec 01 10:34:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:19.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:19 compute-2 ceph-mon[76053]: pgmap v1315: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 558 B/s rd, 0 op/s
Dec 01 10:34:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:19.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:21 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:34:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:21.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:34:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:21.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:34:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:22 compute-2 ceph-mon[76053]: pgmap v1316: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 837 B/s rd, 0 op/s
Dec 01 10:34:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:23.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:23.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:24 compute-2 podman[251505]: 2025-12-01 10:34:24.395524454 +0000 UTC m=+0.052436475 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent)
Dec 01 10:34:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:24 compute-2 ceph-mon[76053]: pgmap v1317: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:34:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:34:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:25.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:34:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:25 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:34:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:34:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:25.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:34:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:26 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:34:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:26 compute-2 ceph-mon[76053]: pgmap v1318: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:34:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:27.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:27 compute-2 podman[251527]: 2025-12-01 10:34:27.445946172 +0000 UTC m=+0.105147317 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 01 10:34:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:27 compute-2 ceph-mon[76053]: pgmap v1319: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:34:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:34:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:27.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:34:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:29.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:29.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:30 compute-2 sudo[251553]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:34:30 compute-2 sudo[251553]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:34:30 compute-2 sudo[251553]: pam_unix(sudo:session): session closed for user root
Dec 01 10:34:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:30 compute-2 ceph-mon[76053]: pgmap v1320: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:34:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:31 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:34:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:34:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:31.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:34:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:31.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:32 compute-2 ceph-mon[76053]: pgmap v1321: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:34:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:33.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:33 compute-2 ceph-mon[76053]: pgmap v1322: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:34:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:34:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:33.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:34:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:35.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:34:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:35.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:34:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:36 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:34:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:36 compute-2 ceph-mon[76053]: pgmap v1323: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:34:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:37.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:37 compute-2 ceph-mon[76053]: pgmap v1324: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:34:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:34:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:37.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:34:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:38 compute-2 podman[251586]: 2025-12-01 10:34:38.414517784 +0000 UTC m=+0.076848474 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 01 10:34:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:39.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:39 compute-2 ceph-mon[76053]: pgmap v1325: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:34:39 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:34:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:34:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:39.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:34:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:41 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:34:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:41.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:41.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:42 compute-2 ceph-mon[76053]: pgmap v1326: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:34:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:43.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:43 compute-2 ceph-mon[76053]: pgmap v1327: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:34:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:43.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:34:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:45.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:34:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:45 compute-2 ceph-mon[76053]: pgmap v1328: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:34:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:34:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:45.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:34:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:46 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:34:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:34:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:47.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:34:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:47.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:48 compute-2 ceph-mon[76053]: pgmap v1329: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:34:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:34:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:49.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:34:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:49.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:50 compute-2 sudo[251626]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:34:50 compute-2 sudo[251626]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:34:50 compute-2 sudo[251626]: pam_unix(sudo:session): session closed for user root
Dec 01 10:34:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:50 compute-2 ceph-mon[76053]: pgmap v1330: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:34:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:51 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:34:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:51.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:51.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:52 compute-2 ceph-mon[76053]: pgmap v1331: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:34:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:34:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:53.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:34:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:53 compute-2 ceph-mon[76053]: pgmap v1332: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:34:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:53.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:54 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:34:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:34:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:55.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:34:55 compute-2 podman[251655]: 2025-12-01 10:34:55.418353602 +0000 UTC m=+0.078989827 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 01 10:34:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:55 compute-2 ceph-mon[76053]: pgmap v1333: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:34:55 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0.
Dec 01 10:34:55 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:55.898298) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 10:34:55 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76
Dec 01 10:34:55 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585295898350, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 1100, "num_deletes": 255, "total_data_size": 2625117, "memory_usage": 2677264, "flush_reason": "Manual Compaction"}
Dec 01 10:34:55 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started
Dec 01 10:34:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:55.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:55 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585295911216, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 1718985, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 39496, "largest_seqno": 40591, "table_properties": {"data_size": 1713999, "index_size": 2510, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 10753, "raw_average_key_size": 19, "raw_value_size": 1703981, "raw_average_value_size": 3109, "num_data_blocks": 108, "num_entries": 548, "num_filter_entries": 548, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764585212, "oldest_key_time": 1764585212, "file_creation_time": 1764585295, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:34:55 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 12959 microseconds, and 4936 cpu microseconds.
Dec 01 10:34:55 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:34:55 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:55.911257) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 1718985 bytes OK
Dec 01 10:34:55 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:55.911273) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started
Dec 01 10:34:55 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:55.912738) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done
Dec 01 10:34:55 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:55.912750) EVENT_LOG_v1 {"time_micros": 1764585295912746, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 10:34:55 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:55.912773) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 10:34:55 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 2619742, prev total WAL file size 2619742, number of live WAL files 2.
Dec 01 10:34:55 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:34:55 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:55.913384) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031303034' seq:72057594037927935, type:22 .. '6C6F676D0031323535' seq:0, type:0; will stop at (end)
Dec 01 10:34:55 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 10:34:55 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(1678KB)], [75(12MB)]
Dec 01 10:34:55 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585295913454, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 15061772, "oldest_snapshot_seqno": -1}
Dec 01 10:34:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:56 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 6778 keys, 14900169 bytes, temperature: kUnknown
Dec 01 10:34:56 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585296000839, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 14900169, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14855324, "index_size": 26813, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16965, "raw_key_size": 178785, "raw_average_key_size": 26, "raw_value_size": 14733636, "raw_average_value_size": 2173, "num_data_blocks": 1057, "num_entries": 6778, "num_filter_entries": 6778, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764585295, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:34:56 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:34:56 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:56.001288) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 14900169 bytes
Dec 01 10:34:56 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:56.002531) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 172.2 rd, 170.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.6, 12.7 +0.0 blob) out(14.2 +0.0 blob), read-write-amplify(17.4) write-amplify(8.7) OK, records in: 7306, records dropped: 528 output_compression: NoCompression
Dec 01 10:34:56 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:56.002549) EVENT_LOG_v1 {"time_micros": 1764585296002541, "job": 46, "event": "compaction_finished", "compaction_time_micros": 87446, "compaction_time_cpu_micros": 36244, "output_level": 6, "num_output_files": 1, "total_output_size": 14900169, "num_input_records": 7306, "num_output_records": 6778, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 10:34:56 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:34:56 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585296003056, "job": 46, "event": "table_file_deletion", "file_number": 77}
Dec 01 10:34:56 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:34:56 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585296005379, "job": 46, "event": "table_file_deletion", "file_number": 75}
Dec 01 10:34:56 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:55.913307) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:34:56 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:56.005438) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:34:56 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:56.005442) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:34:56 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:56.005444) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:34:56 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:56.005445) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:34:56 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:56.005447) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:34:56 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:34:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:57.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:57.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:58 compute-2 podman[251678]: 2025-12-01 10:34:58.38930239 +0000 UTC m=+0.047810342 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 01 10:34:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:58 compute-2 ceph-mon[76053]: pgmap v1334: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:34:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:59.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:34:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:34:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:34:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:59.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:34:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:00 compute-2 ceph-mon[76053]: pgmap v1335: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:35:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:01 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:35:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:01.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:01 compute-2 ceph-mon[76053]: pgmap v1336: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:35:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:01.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:03.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:03.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:04 compute-2 ceph-mon[76053]: pgmap v1337: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:35:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:35:04.725 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:35:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:35:04.726 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:35:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:35:04.726 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:35:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:05.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:05 compute-2 ceph-mon[76053]: pgmap v1338: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:35:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:05.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:06 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:35:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:06 compute-2 nova_compute[230216]: 2025-12-01 10:35:06.676 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:35:06 compute-2 nova_compute[230216]: 2025-12-01 10:35:06.677 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:35:06 compute-2 nova_compute[230216]: 2025-12-01 10:35:06.677 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:35:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:07 compute-2 nova_compute[230216]: 2025-12-01 10:35:07.201 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:35:07 compute-2 nova_compute[230216]: 2025-12-01 10:35:07.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:35:07 compute-2 nova_compute[230216]: 2025-12-01 10:35:07.206 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 10:35:07 compute-2 nova_compute[230216]: 2025-12-01 10:35:07.206 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 10:35:07 compute-2 nova_compute[230216]: 2025-12-01 10:35:07.220 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 10:35:07 compute-2 nova_compute[230216]: 2025-12-01 10:35:07.220 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:35:07 compute-2 nova_compute[230216]: 2025-12-01 10:35:07.221 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 01 10:35:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/530935708' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:35:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/530935708' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:35:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:07.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:07.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:08 compute-2 nova_compute[230216]: 2025-12-01 10:35:08.220 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:35:08 compute-2 nova_compute[230216]: 2025-12-01 10:35:08.220 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 10:35:08 compute-2 nova_compute[230216]: 2025-12-01 10:35:08.221 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:35:08 compute-2 nova_compute[230216]: 2025-12-01 10:35:08.221 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 01 10:35:08 compute-2 nova_compute[230216]: 2025-12-01 10:35:08.238 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 01 10:35:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:08 compute-2 ceph-mon[76053]: pgmap v1339: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:35:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:09.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:09 compute-2 podman[251709]: 2025-12-01 10:35:09.421633593 +0000 UTC m=+0.083083608 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 01 10:35:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:09 compute-2 ceph-mon[76053]: pgmap v1340: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:35:09 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:35:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:09.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:10 compute-2 sudo[251737]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:35:10 compute-2 sudo[251737]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:35:10 compute-2 sudo[251737]: pam_unix(sudo:session): session closed for user root
Dec 01 10:35:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:11 compute-2 nova_compute[230216]: 2025-12-01 10:35:11.225 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:35:11 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:35:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:35:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:11.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:35:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:35:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:11.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:35:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:12 compute-2 ceph-mon[76053]: pgmap v1341: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:35:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:13 compute-2 nova_compute[230216]: 2025-12-01 10:35:13.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:35:13 compute-2 nova_compute[230216]: 2025-12-01 10:35:13.312 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:35:13 compute-2 nova_compute[230216]: 2025-12-01 10:35:13.313 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:35:13 compute-2 nova_compute[230216]: 2025-12-01 10:35:13.313 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:35:13 compute-2 nova_compute[230216]: 2025-12-01 10:35:13.313 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 10:35:13 compute-2 nova_compute[230216]: 2025-12-01 10:35:13.313 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:35:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:35:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:13.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:35:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:13 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1567705588' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:35:13 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:35:13 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1731441940' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:35:13 compute-2 nova_compute[230216]: 2025-12-01 10:35:13.793 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:35:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:35:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:13.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:35:13 compute-2 nova_compute[230216]: 2025-12-01 10:35:13.960 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 10:35:13 compute-2 nova_compute[230216]: 2025-12-01 10:35:13.961 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5157MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 10:35:13 compute-2 nova_compute[230216]: 2025-12-01 10:35:13.961 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:35:13 compute-2 nova_compute[230216]: 2025-12-01 10:35:13.961 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:35:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:14 compute-2 nova_compute[230216]: 2025-12-01 10:35:14.164 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 10:35:14 compute-2 nova_compute[230216]: 2025-12-01 10:35:14.165 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 10:35:14 compute-2 nova_compute[230216]: 2025-12-01 10:35:14.285 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Refreshing inventories for resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 01 10:35:14 compute-2 nova_compute[230216]: 2025-12-01 10:35:14.411 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Updating ProviderTree inventory for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 01 10:35:14 compute-2 nova_compute[230216]: 2025-12-01 10:35:14.411 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Updating inventory in ProviderTree for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 01 10:35:14 compute-2 nova_compute[230216]: 2025-12-01 10:35:14.430 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Refreshing aggregate associations for resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 01 10:35:14 compute-2 nova_compute[230216]: 2025-12-01 10:35:14.449 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Refreshing trait associations for resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE42,HW_CPU_X86_SSSE3,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AESNI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NODE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_FMA3,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_ABM,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SHA,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_F16C,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 01 10:35:14 compute-2 nova_compute[230216]: 2025-12-01 10:35:14.466 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:35:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:14 compute-2 ceph-mon[76053]: pgmap v1342: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:35:14 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1731441940' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:35:14 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3185267231' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:35:14 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:35:14 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/731946058' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:35:14 compute-2 nova_compute[230216]: 2025-12-01 10:35:14.907 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:35:14 compute-2 nova_compute[230216]: 2025-12-01 10:35:14.913 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 10:35:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:14 compute-2 nova_compute[230216]: 2025-12-01 10:35:14.997 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 10:35:14 compute-2 nova_compute[230216]: 2025-12-01 10:35:14.999 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 10:35:14 compute-2 nova_compute[230216]: 2025-12-01 10:35:14.999 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.038s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:35:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:15.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:15.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:16 compute-2 nova_compute[230216]: 2025-12-01 10:35:15.999 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:35:16 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:35:16 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/731946058' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:35:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:16 compute-2 sudo[251812]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:35:16 compute-2 sudo[251812]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:35:16 compute-2 sudo[251812]: pam_unix(sudo:session): session closed for user root
Dec 01 10:35:16 compute-2 sudo[251837]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:35:16 compute-2 sudo[251837]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:35:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:17 compute-2 sudo[251837]: pam_unix(sudo:session): session closed for user root
Dec 01 10:35:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:17.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:17 compute-2 ceph-mon[76053]: pgmap v1343: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:35:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:35:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:17.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:35:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:19 compute-2 ceph-mon[76053]: pgmap v1344: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:35:19 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:35:19 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:35:19 compute-2 ceph-mon[76053]: pgmap v1345: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 613 B/s rd, 0 op/s
Dec 01 10:35:19 compute-2 ceph-mon[76053]: pgmap v1346: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:35:19 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:35:19 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:35:19 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:35:19 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:35:19 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:35:19 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3129889998' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:35:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:19.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:19.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:20 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1569583609' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:35:20 compute-2 ceph-mon[76053]: pgmap v1347: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 383 B/s rd, 0 op/s
Dec 01 10:35:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:21 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:35:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:21.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:21.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:22 compute-2 ceph-mon[76053]: pgmap v1348: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:35:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:23 compute-2 nova_compute[230216]: 2025-12-01 10:35:23.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:35:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:23.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:23 compute-2 ceph-mon[76053]: pgmap v1349: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:35:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:23.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:24 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:35:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:25 compute-2 sudo[251900]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:35:25 compute-2 sudo[251900]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:35:25 compute-2 sudo[251900]: pam_unix(sudo:session): session closed for user root
Dec 01 10:35:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:35:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:25.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:35:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:25.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:26 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:35:26 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:35:26 compute-2 ceph-mon[76053]: pgmap v1350: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 383 B/s rd, 0 op/s
Dec 01 10:35:26 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:35:26 compute-2 podman[251927]: 2025-12-01 10:35:26.390360973 +0000 UTC m=+0.048256564 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 01 10:35:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:35:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:27.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:35:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:27 compute-2 ceph-mon[76053]: pgmap v1351: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 614 B/s rd, 0 op/s
Dec 01 10:35:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:27.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:29 compute-2 podman[251948]: 2025-12-01 10:35:29.404997174 +0000 UTC m=+0.062175564 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 01 10:35:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:29.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:29.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:30 compute-2 sudo[251970]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:35:30 compute-2 sudo[251970]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:35:30 compute-2 sudo[251970]: pam_unix(sudo:session): session closed for user root
Dec 01 10:35:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:30 compute-2 ceph-mon[76053]: pgmap v1352: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:35:30 compute-2 sshd-session[251995]: error: kex_exchange_identification: read: Connection reset by peer
Dec 01 10:35:30 compute-2 sshd-session[251995]: Connection reset by 69.164.217.245 port 33466
Dec 01 10:35:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:31 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:35:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:35:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:31.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:35:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:31 compute-2 ceph-mon[76053]: pgmap v1353: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:35:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:31.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:33.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:33 compute-2 ceph-mon[76053]: pgmap v1354: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:35:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:33.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:35.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:35 compute-2 ceph-mon[76053]: pgmap v1355: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:35:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:35.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:36 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:35:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:37.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:37 compute-2 ceph-mon[76053]: pgmap v1356: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:35:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:37.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:35:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:39.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:35:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:39.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:40 compute-2 podman[252006]: 2025-12-01 10:35:40.464960711 +0000 UTC m=+0.123382270 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 01 10:35:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:40 compute-2 ceph-mon[76053]: pgmap v1357: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:35:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:35:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:41 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:35:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:35:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:41.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:35:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:41.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:42 compute-2 ceph-mon[76053]: pgmap v1358: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:35:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:43.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:43 compute-2 ceph-mon[76053]: pgmap v1359: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:35:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:43.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:35:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:45.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:35:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:45.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:46 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:35:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:46 compute-2 ceph-mon[76053]: pgmap v1360: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:35:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:47.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:47.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:48 compute-2 ceph-mon[76053]: pgmap v1361: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:35:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:35:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:49.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:35:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:49 compute-2 ceph-mon[76053]: pgmap v1362: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:35:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:49.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:50 compute-2 sudo[252043]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:35:50 compute-2 sudo[252043]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:35:50 compute-2 sudo[252043]: pam_unix(sudo:session): session closed for user root
Dec 01 10:35:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:51 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:35:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:51.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:35:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:51.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:35:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:52 compute-2 ceph-mon[76053]: pgmap v1363: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:35:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:53.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:53 compute-2 ceph-mon[76053]: pgmap v1364: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:35:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:53.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:54 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:35:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:35:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:55.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:35:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:55 compute-2 ceph-mon[76053]: pgmap v1365: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:35:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:55.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:56 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:35:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:57 compute-2 podman[252074]: 2025-12-01 10:35:57.392426808 +0000 UTC m=+0.049385024 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 01 10:35:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:35:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:57.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:35:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:57 compute-2 ceph-mon[76053]: pgmap v1366: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:35:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:35:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:58.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:35:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:35:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:35:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:35:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:59.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:35:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:00.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:00 compute-2 podman[252098]: 2025-12-01 10:36:00.392477077 +0000 UTC m=+0.052415579 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 01 10:36:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:00 compute-2 ceph-mon[76053]: pgmap v1367: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:36:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:01 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:36:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:01.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:01 compute-2 ceph-mon[76053]: pgmap v1368: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:36:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:02.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:03 compute-2 nova_compute[230216]: 2025-12-01 10:36:03.247 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:36:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:03.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:03 compute-2 ceph-mon[76053]: pgmap v1369: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:36:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:04.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:36:04.726 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:36:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:36:04.727 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:36:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:36:04.727 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:36:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:05 compute-2 nova_compute[230216]: 2025-12-01 10:36:05.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:36:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:05.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:06.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:06 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:36:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:07 compute-2 ceph-mon[76053]: pgmap v1370: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:36:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:07.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:36:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:08.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:36:08 compute-2 nova_compute[230216]: 2025-12-01 10:36:08.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:36:08 compute-2 nova_compute[230216]: 2025-12-01 10:36:08.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:36:08 compute-2 nova_compute[230216]: 2025-12-01 10:36:08.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:36:08 compute-2 nova_compute[230216]: 2025-12-01 10:36:08.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 10:36:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/3551098333' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:36:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/3551098333' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:36:08 compute-2 ceph-mon[76053]: pgmap v1371: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:36:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:09 compute-2 nova_compute[230216]: 2025-12-01 10:36:09.202 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:36:09 compute-2 nova_compute[230216]: 2025-12-01 10:36:09.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:36:09 compute-2 nova_compute[230216]: 2025-12-01 10:36:09.206 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 10:36:09 compute-2 nova_compute[230216]: 2025-12-01 10:36:09.206 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 10:36:09 compute-2 nova_compute[230216]: 2025-12-01 10:36:09.226 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 10:36:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:09.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:09 compute-2 ceph-mon[76053]: pgmap v1372: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:36:09 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:36:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:10.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:10 compute-2 sudo[252129]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:36:10 compute-2 sudo[252129]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:36:10 compute-2 sudo[252129]: pam_unix(sudo:session): session closed for user root
Dec 01 10:36:10 compute-2 podman[252153]: 2025-12-01 10:36:10.721062962 +0000 UTC m=+0.070164574 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 01 10:36:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:11 compute-2 nova_compute[230216]: 2025-12-01 10:36:11.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:36:11 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:36:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:36:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:11.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:36:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:36:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:12.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:36:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:12 compute-2 ceph-mon[76053]: pgmap v1373: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:36:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:13.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:13 compute-2 ceph-mon[76053]: pgmap v1374: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:36:13 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3478462913' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:36:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:14.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:14 compute-2 nova_compute[230216]: 2025-12-01 10:36:14.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:36:14 compute-2 nova_compute[230216]: 2025-12-01 10:36:14.230 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:36:14 compute-2 nova_compute[230216]: 2025-12-01 10:36:14.230 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:36:14 compute-2 nova_compute[230216]: 2025-12-01 10:36:14.231 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:36:14 compute-2 nova_compute[230216]: 2025-12-01 10:36:14.231 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 10:36:14 compute-2 nova_compute[230216]: 2025-12-01 10:36:14.231 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:36:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:14 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:36:14 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1011575640' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:36:14 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1357080478' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:36:14 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1011575640' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:36:14 compute-2 nova_compute[230216]: 2025-12-01 10:36:14.651 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:36:14 compute-2 nova_compute[230216]: 2025-12-01 10:36:14.884 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 10:36:14 compute-2 nova_compute[230216]: 2025-12-01 10:36:14.886 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5158MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 10:36:14 compute-2 nova_compute[230216]: 2025-12-01 10:36:14.886 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:36:14 compute-2 nova_compute[230216]: 2025-12-01 10:36:14.886 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:36:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:15 compute-2 nova_compute[230216]: 2025-12-01 10:36:15.230 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 10:36:15 compute-2 nova_compute[230216]: 2025-12-01 10:36:15.230 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 10:36:15 compute-2 nova_compute[230216]: 2025-12-01 10:36:15.244 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:36:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:15.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:15 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:36:15 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2640763138' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:36:15 compute-2 ceph-mon[76053]: pgmap v1375: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:36:15 compute-2 nova_compute[230216]: 2025-12-01 10:36:15.681 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:36:15 compute-2 nova_compute[230216]: 2025-12-01 10:36:15.688 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 10:36:15 compute-2 nova_compute[230216]: 2025-12-01 10:36:15.704 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 10:36:15 compute-2 nova_compute[230216]: 2025-12-01 10:36:15.706 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 10:36:15 compute-2 nova_compute[230216]: 2025-12-01 10:36:15.706 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.820s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:36:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:16.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:16 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:36:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:16 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2640763138' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:36:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:17.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:17 compute-2 ceph-mon[76053]: pgmap v1376: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:36:17 compute-2 nova_compute[230216]: 2025-12-01 10:36:17.707 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:36:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:18.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:18 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3320979075' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:36:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:36:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:19.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:36:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:19 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/309041401' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:36:19 compute-2 ceph-mon[76053]: pgmap v1377: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:36:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:20.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:21 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:36:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:21.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:21 compute-2 ceph-mon[76053]: pgmap v1378: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:36:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:36:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:22.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:36:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:36:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:23.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:36:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:24.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:24 compute-2 ceph-mon[76053]: pgmap v1379: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:36:24 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:36:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:25 compute-2 sudo[252241]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:36:25 compute-2 sudo[252241]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:36:25 compute-2 sudo[252241]: pam_unix(sudo:session): session closed for user root
Dec 01 10:36:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:25.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:25 compute-2 sudo[252266]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:36:25 compute-2 sudo[252266]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:36:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:25 compute-2 ceph-mon[76053]: pgmap v1380: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:36:26 compute-2 sudo[252266]: pam_unix(sudo:session): session closed for user root
Dec 01 10:36:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:26.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:26 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:36:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:27.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:36:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:28.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:36:28 compute-2 podman[252323]: 2025-12-01 10:36:28.395243355 +0000 UTC m=+0.049489677 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 01 10:36:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:28 compute-2 ceph-mon[76053]: pgmap v1381: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:36:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:29.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:29 compute-2 ceph-mon[76053]: pgmap v1382: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:36:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:30.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:30 compute-2 sudo[252344]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:36:30 compute-2 sudo[252344]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:36:30 compute-2 sudo[252344]: pam_unix(sudo:session): session closed for user root
Dec 01 10:36:30 compute-2 podman[252368]: 2025-12-01 10:36:30.808489233 +0000 UTC m=+0.058644571 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Dec 01 10:36:30 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:36:30 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:36:30 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:36:30 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:36:30 compute-2 ceph-mon[76053]: pgmap v1383: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 557 B/s rd, 0 op/s
Dec 01 10:36:30 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:36:30 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:36:30 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:36:30 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:36:30 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:36:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:31 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:36:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:36:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:31.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:36:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:32.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:32 compute-2 ceph-mon[76053]: pgmap v1384: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 557 B/s rd, 0 op/s
Dec 01 10:36:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:33.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:36:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:34.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:36:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:34 compute-2 ceph-mon[76053]: pgmap v1385: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 557 B/s rd, 0 op/s
Dec 01 10:36:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:35.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:36.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:36 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:36:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:37 compute-2 ceph-mon[76053]: pgmap v1386: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 836 B/s rd, 0 op/s
Dec 01 10:36:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:37.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:37 compute-2 sudo[252398]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:36:37 compute-2 sudo[252398]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:36:37 compute-2 sudo[252398]: pam_unix(sudo:session): session closed for user root
Dec 01 10:36:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:38.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:38 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:36:38 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:36:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:36:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:39.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:36:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:39 compute-2 ceph-mon[76053]: pgmap v1387: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 557 B/s rd, 0 op/s
Dec 01 10:36:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:40.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:40 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:36:40 compute-2 ceph-mon[76053]: pgmap v1388: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 557 B/s rd, 0 op/s
Dec 01 10:36:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:41 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:36:41 compute-2 podman[252425]: 2025-12-01 10:36:41.419386632 +0000 UTC m=+0.078184831 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 01 10:36:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:41.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:42.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:42 compute-2 ceph-mon[76053]: pgmap v1389: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:36:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:36:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:43.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:36:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:36:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:44.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:36:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:44 compute-2 ceph-mon[76053]: pgmap v1390: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:36:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:45.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:46.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:46 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:36:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:46 compute-2 ceph-mon[76053]: pgmap v1391: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:36:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:47.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:48.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:48 compute-2 ceph-mon[76053]: pgmap v1392: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:36:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:36:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:49.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:36:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:50.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:50 compute-2 ceph-mon[76053]: pgmap v1393: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:36:50 compute-2 sudo[252461]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:36:50 compute-2 sudo[252461]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:36:50 compute-2 sudo[252461]: pam_unix(sudo:session): session closed for user root
Dec 01 10:36:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:51 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:36:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:51.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:52.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:52 compute-2 ceph-mon[76053]: pgmap v1394: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:36:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:53.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:54.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:55 compute-2 ceph-mon[76053]: pgmap v1395: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:36:55 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:36:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:55.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:56.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:56 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:36:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:56 compute-2 ceph-mon[76053]: pgmap v1396: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:36:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:36:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:57.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:36:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:58.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:59 compute-2 podman[252494]: 2025-12-01 10:36:59.392479134 +0000 UTC m=+0.054335695 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 01 10:36:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:36:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:36:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:59.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:36:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:36:59 compute-2 ceph-mon[76053]: pgmap v1397: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:37:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:00.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:01 compute-2 ceph-mon[76053]: pgmap v1398: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:37:01 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:37:01 compute-2 podman[252515]: 2025-12-01 10:37:01.39141784 +0000 UTC m=+0.049553447 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 01 10:37:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:01.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:02.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:02 compute-2 ceph-mon[76053]: pgmap v1399: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:37:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:03.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:04.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:04 compute-2 ceph-mon[76053]: pgmap v1400: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:37:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:37:04.729 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:37:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:37:04.730 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:37:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:37:04.731 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:37:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:37:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:05.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:37:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:06.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:06 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:37:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #79. Immutable memtables: 0.
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.776714) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 47] Flushing memtable with next log file: 79
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585426776832, "job": 47, "event": "flush_started", "num_memtables": 1, "num_entries": 1510, "num_deletes": 251, "total_data_size": 3929750, "memory_usage": 3990120, "flush_reason": "Manual Compaction"}
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 47] Level-0 flush table #80: started
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585426789725, "cf_name": "default", "job": 47, "event": "table_file_creation", "file_number": 80, "file_size": 2533730, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 40596, "largest_seqno": 42101, "table_properties": {"data_size": 2527310, "index_size": 3619, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1733, "raw_key_size": 13565, "raw_average_key_size": 20, "raw_value_size": 2514458, "raw_average_value_size": 3714, "num_data_blocks": 158, "num_entries": 677, "num_filter_entries": 677, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764585297, "oldest_key_time": 1764585297, "file_creation_time": 1764585426, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 80, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 47] Flush lasted 13032 microseconds, and 5560 cpu microseconds.
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.789769) [db/flush_job.cc:967] [default] [JOB 47] Level-0 flush table #80: 2533730 bytes OK
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.789784) [db/memtable_list.cc:519] [default] Level-0 commit table #80 started
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.791624) [db/memtable_list.cc:722] [default] Level-0 commit table #80: memtable #1 done
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.791634) EVENT_LOG_v1 {"time_micros": 1764585426791631, "job": 47, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.791650) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 47] Try to delete WAL files size 3922757, prev total WAL file size 3922757, number of live WAL files 2.
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000076.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.792532) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033323633' seq:72057594037927935, type:22 .. '7061786F730033353135' seq:0, type:0; will stop at (end)
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 48] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 47 Base level 0, inputs: [80(2474KB)], [78(14MB)]
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585426792583, "job": 48, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [80], "files_L6": [78], "score": -1, "input_data_size": 17433899, "oldest_snapshot_seqno": -1}
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 48] Generated table #81: 6939 keys, 15161056 bytes, temperature: kUnknown
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585426872310, "cf_name": "default", "job": 48, "event": "table_file_creation", "file_number": 81, "file_size": 15161056, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15115380, "index_size": 27196, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17413, "raw_key_size": 182838, "raw_average_key_size": 26, "raw_value_size": 14991094, "raw_average_value_size": 2160, "num_data_blocks": 1068, "num_entries": 6939, "num_filter_entries": 6939, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764585426, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 81, "seqno_to_time_mapping": "N/A"}}
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.872585) [db/compaction/compaction_job.cc:1663] [default] [JOB 48] Compacted 1@0 + 1@6 files to L6 => 15161056 bytes
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.873856) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 218.4 rd, 190.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.4, 14.2 +0.0 blob) out(14.5 +0.0 blob), read-write-amplify(12.9) write-amplify(6.0) OK, records in: 7455, records dropped: 516 output_compression: NoCompression
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.873879) EVENT_LOG_v1 {"time_micros": 1764585426873869, "job": 48, "event": "compaction_finished", "compaction_time_micros": 79808, "compaction_time_cpu_micros": 32195, "output_level": 6, "num_output_files": 1, "total_output_size": 15161056, "num_input_records": 7455, "num_output_records": 6939, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000080.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585426874615, "job": 48, "event": "table_file_deletion", "file_number": 80}
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000078.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585426878099, "job": 48, "event": "table_file_deletion", "file_number": 78}
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.792435) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.878143) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.878148) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.878151) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.878153) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:37:06 compute-2 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.878155) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 01 10:37:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:07 compute-2 ceph-mon[76053]: pgmap v1401: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:37:07 compute-2 nova_compute[230216]: 2025-12-01 10:37:07.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:37:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:07.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:08.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/4194406903' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:37:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/4194406903' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:37:08 compute-2 nova_compute[230216]: 2025-12-01 10:37:08.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:37:08 compute-2 nova_compute[230216]: 2025-12-01 10:37:08.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 10:37:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:09 compute-2 ceph-mon[76053]: pgmap v1402: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:37:09 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:09 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:09 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:09.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:09 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:10 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:10 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:10 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:10.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:10 compute-2 nova_compute[230216]: 2025-12-01 10:37:10.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:37:10 compute-2 nova_compute[230216]: 2025-12-01 10:37:10.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:37:10 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:37:10 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:10 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:10 compute-2 sudo[252545]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:37:10 compute-2 sudo[252545]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:37:10 compute-2 sudo[252545]: pam_unix(sudo:session): session closed for user root
Dec 01 10:37:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:11 compute-2 nova_compute[230216]: 2025-12-01 10:37:11.201 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:37:11 compute-2 nova_compute[230216]: 2025-12-01 10:37:11.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:37:11 compute-2 nova_compute[230216]: 2025-12-01 10:37:11.206 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 01 10:37:11 compute-2 nova_compute[230216]: 2025-12-01 10:37:11.206 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 01 10:37:11 compute-2 nova_compute[230216]: 2025-12-01 10:37:11.220 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 01 10:37:11 compute-2 ceph-mon[76053]: pgmap v1403: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:37:11 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:37:11 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:11 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:37:11 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:11.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:37:11 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:11 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:12 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:12 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:12 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:12.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:12 compute-2 nova_compute[230216]: 2025-12-01 10:37:12.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:37:12 compute-2 podman[252572]: 2025-12-01 10:37:12.441646717 +0000 UTC m=+0.104645461 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 01 10:37:12 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:12 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:12 compute-2 ceph-mon[76053]: pgmap v1404: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:37:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:13 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:13 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:13 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:13.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:13 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:13 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:13 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/806659210' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:37:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:14 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:14 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:14 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:14.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:14 compute-2 nova_compute[230216]: 2025-12-01 10:37:14.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:37:14 compute-2 nova_compute[230216]: 2025-12-01 10:37:14.229 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:37:14 compute-2 nova_compute[230216]: 2025-12-01 10:37:14.229 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:37:14 compute-2 nova_compute[230216]: 2025-12-01 10:37:14.229 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:37:14 compute-2 nova_compute[230216]: 2025-12-01 10:37:14.230 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 01 10:37:14 compute-2 nova_compute[230216]: 2025-12-01 10:37:14.230 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:37:14 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:14 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:14 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:37:14 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1803103579' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:37:14 compute-2 nova_compute[230216]: 2025-12-01 10:37:14.683 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:37:14 compute-2 nova_compute[230216]: 2025-12-01 10:37:14.835 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 01 10:37:14 compute-2 nova_compute[230216]: 2025-12-01 10:37:14.836 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5159MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 01 10:37:14 compute-2 nova_compute[230216]: 2025-12-01 10:37:14.836 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:37:14 compute-2 nova_compute[230216]: 2025-12-01 10:37:14.837 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:37:14 compute-2 nova_compute[230216]: 2025-12-01 10:37:14.908 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 01 10:37:14 compute-2 nova_compute[230216]: 2025-12-01 10:37:14.909 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 01 10:37:14 compute-2 nova_compute[230216]: 2025-12-01 10:37:14.930 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 01 10:37:14 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1073295087' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:37:14 compute-2 ceph-mon[76053]: pgmap v1405: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:37:14 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1803103579' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:37:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:15 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 01 10:37:15 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2223600172' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:37:15 compute-2 nova_compute[230216]: 2025-12-01 10:37:15.396 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 01 10:37:15 compute-2 nova_compute[230216]: 2025-12-01 10:37:15.401 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 01 10:37:15 compute-2 nova_compute[230216]: 2025-12-01 10:37:15.421 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 01 10:37:15 compute-2 nova_compute[230216]: 2025-12-01 10:37:15.423 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 01 10:37:15 compute-2 nova_compute[230216]: 2025-12-01 10:37:15.423 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.586s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:37:15 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:15 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:15 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:15.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:15 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:15 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:15 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2223600172' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:37:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:16 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:16 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:37:16 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:16.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:37:16 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:37:16 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:16 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:16 compute-2 ceph-mon[76053]: pgmap v1406: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:37:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:17 compute-2 nova_compute[230216]: 2025-12-01 10:37:17.424 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:37:17 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:17 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:37:17 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:17.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:37:17 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:17 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:18 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:18 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:18 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:18.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:18 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:18 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:18 compute-2 ceph-mon[76053]: pgmap v1407: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:37:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:19 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:19 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:19 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:19.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:19 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:19 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:19 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2572138056' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:37:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:20 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:20 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:20 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:20.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:20 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:20 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:20 compute-2 ceph-mon[76053]: pgmap v1408: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:37:20 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3619192831' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 01 10:37:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:21 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:37:21 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:21 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:37:21 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:21.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:37:21 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:21 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:22 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:22 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:22 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:22.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:22 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:22 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:22 compute-2 ceph-mon[76053]: pgmap v1409: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:37:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:23 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:23 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.003000071s ======
Dec 01 10:37:23 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:23.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000071s
Dec 01 10:37:23 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:23 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:24 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:24 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:37:24 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:24.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:37:24 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:24 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:24 compute-2 ceph-mon[76053]: pgmap v1410: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:37:24 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:37:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:25 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:25 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:37:25 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:25.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:37:25 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:25 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:26 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:26 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:26 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:26.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:26 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:37:26 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:26 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:26 compute-2 ceph-mon[76053]: pgmap v1411: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:37:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:27 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:27 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:27 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:27.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:27 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:27 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:28 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:28 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:28 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:28.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:28 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:28 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:28 compute-2 ceph-mon[76053]: pgmap v1412: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:37:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:29 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:29 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:29 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:29.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:29 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:29 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:30 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:30 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:37:30 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:30.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:37:30 compute-2 podman[252661]: 2025-12-01 10:37:30.39764576 +0000 UTC m=+0.058235231 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 01 10:37:30 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:30 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:30 compute-2 ceph-mon[76053]: pgmap v1413: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:37:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:31 compute-2 sudo[252682]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:37:31 compute-2 sudo[252682]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:37:31 compute-2 sudo[252682]: pam_unix(sudo:session): session closed for user root
Dec 01 10:37:31 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:37:31 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:31 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:31 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:31 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:31 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:31.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:32 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:32 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:37:32 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:32.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:37:32 compute-2 podman[252709]: 2025-12-01 10:37:32.390438694 +0000 UTC m=+0.051264720 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd)
Dec 01 10:37:32 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:32 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:32 compute-2 ceph-mon[76053]: pgmap v1414: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:37:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:33 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:33 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:33 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:33 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:37:33 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:33.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:37:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:34 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:34 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:37:34 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:34.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:37:34 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:34 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:34 compute-2 ceph-mon[76053]: pgmap v1415: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:37:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:35 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:35 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:35 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:35 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:37:35 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:35.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:37:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:36 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:36 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:36 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:36.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:36 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:37:36 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:36 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:36 compute-2 ceph-mon[76053]: pgmap v1416: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:37:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:37 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:37 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:37 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:37 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:37 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:37.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:37 compute-2 sudo[252735]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 01 10:37:37 compute-2 sudo[252735]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:37:37 compute-2 sudo[252735]: pam_unix(sudo:session): session closed for user root
Dec 01 10:37:37 compute-2 sudo[252760]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 01 10:37:37 compute-2 sudo[252760]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:37:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:38 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:38 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:37:38 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:38.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:37:38 compute-2 sudo[252760]: pam_unix(sudo:session): session closed for user root
Dec 01 10:37:38 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:38 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:38 compute-2 ceph-mon[76053]: pgmap v1417: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:37:38 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:37:38 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 01 10:37:38 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:37:38 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:37:38 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 01 10:37:38 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 01 10:37:38 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:37:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:39 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:39 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:39 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:39 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:39 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:39.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:39 compute-2 ceph-mon[76053]: pgmap v1418: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 610 B/s rd, 0 op/s
Dec 01 10:37:39 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:37:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:40 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:40 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:40 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:40.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:40 compute-2 sshd-session[252818]: Accepted publickey for zuul from 192.168.122.10 port 39866 ssh2: ECDSA SHA256:Ns7Z+ky9/Eij1snDgU2gfVod0cLz8CVquneqn/pUMpY
Dec 01 10:37:40 compute-2 systemd-logind[795]: New session 58 of user zuul.
Dec 01 10:37:40 compute-2 systemd[1]: Started Session 58 of User zuul.
Dec 01 10:37:40 compute-2 sshd-session[252818]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 01 10:37:40 compute-2 sudo[252822]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Dec 01 10:37:40 compute-2 sudo[252822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 01 10:37:40 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:40 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:41 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:37:41 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:41 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:41 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:41 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:37:41 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:41.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:37:41 compute-2 ceph-mon[76053]: pgmap v1419: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 915 B/s rd, 0 op/s
Dec 01 10:37:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:42 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:42 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:42 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:42.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:42 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:42 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:43 compute-2 podman[253030]: 2025-12-01 10:37:43.43575418 +0000 UTC m=+0.089717245 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 01 10:37:43 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:43 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:43 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:43 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:37:43 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:43.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:37:43 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Dec 01 10:37:43 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/840083907' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 01 10:37:44 compute-2 ceph-mon[76053]: pgmap v1420: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 610 B/s rd, 0 op/s
Dec 01 10:37:44 compute-2 ceph-mon[76053]: from='client.26776 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:44 compute-2 ceph-mon[76053]: from='client.17502 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:44 compute-2 ceph-mon[76053]: from='client.27320 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:44 compute-2 ceph-mon[76053]: from='client.26788 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:44 compute-2 ceph-mon[76053]: from='client.17511 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:44 compute-2 ceph-mon[76053]: from='client.27326 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:44 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/840083907' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 01 10:37:44 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1987919371' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 01 10:37:44 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1668722920' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 01 10:37:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:44 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:44 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:37:44 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:44.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:37:44 compute-2 sudo[253115]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 01 10:37:44 compute-2 sudo[253115]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:37:44 compute-2 sudo[253115]: pam_unix(sudo:session): session closed for user root
Dec 01 10:37:44 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:44 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:45 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:37:45 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec 01 10:37:45 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:45 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:45 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:45 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:37:45 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:45.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:37:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:46 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:46 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:46 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:46.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:46 compute-2 ceph-mon[76053]: pgmap v1421: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 610 B/s rd, 0 op/s
Dec 01 10:37:46 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:37:46 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:46 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:46 compute-2 ovs-vsctl[253222]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec 01 10:37:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:47 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:47 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:47 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:47 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:47 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:47.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:47 compute-2 virtqemud[229722]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec 01 10:37:47 compute-2 virtqemud[229722]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec 01 10:37:47 compute-2 virtqemud[229722]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 01 10:37:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:48 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:48 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:37:48 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:48.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:37:48 compute-2 ceph-mon[76053]: pgmap v1422: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 610 B/s rd, 0 op/s
Dec 01 10:37:48 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: cache status {prefix=cache status} (starting...)
Dec 01 10:37:48 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec 01 10:37:48 compute-2 lvm[253531]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 01 10:37:48 compute-2 lvm[253531]: VG ceph_vg0 finished
Dec 01 10:37:48 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: client ls {prefix=client ls} (starting...)
Dec 01 10:37:48 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec 01 10:37:48 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:48 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:49 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: damage ls {prefix=damage ls} (starting...)
Dec 01 10:37:49 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec 01 10:37:49 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: dump loads {prefix=dump loads} (starting...)
Dec 01 10:37:49 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec 01 10:37:49 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Dec 01 10:37:49 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2676791860' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec 01 10:37:49 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Dec 01 10:37:49 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec 01 10:37:49 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:49 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:49 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Dec 01 10:37:49 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec 01 10:37:49 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:49 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:37:49 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:49.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:37:49 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Dec 01 10:37:49 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec 01 10:37:49 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2676791860' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec 01 10:37:49 compute-2 ceph-mon[76053]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec 01 10:37:49 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Dec 01 10:37:49 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec 01 10:37:49 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 01 10:37:49 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1947193528' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:37:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:50 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Dec 01 10:37:50 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec 01 10:37:50 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:50 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:50 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:50.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:50 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: get subtrees {prefix=get subtrees} (starting...)
Dec 01 10:37:50 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec 01 10:37:50 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Dec 01 10:37:50 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/155377163' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Dec 01 10:37:50 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: ops {prefix=ops} (starting...)
Dec 01 10:37:50 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec 01 10:37:50 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:50 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:50 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Dec 01 10:37:50 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/684590407' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Dec 01 10:37:50 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Dec 01 10:37:50 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2173440090' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Dec 01 10:37:50 compute-2 ceph-mon[76053]: pgmap v1423: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 610 B/s rd, 0 op/s
Dec 01 10:37:50 compute-2 ceph-mon[76053]: from='client.26800 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:50 compute-2 ceph-mon[76053]: from='client.17523 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:50 compute-2 ceph-mon[76053]: from='client.26815 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:50 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2575364442' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec 01 10:37:50 compute-2 ceph-mon[76053]: from='client.27341 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:50 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1216379499' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec 01 10:37:50 compute-2 ceph-mon[76053]: from='client.26824 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:50 compute-2 ceph-mon[76053]: from='client.17532 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:50 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1947193528' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:37:50 compute-2 ceph-mon[76053]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec 01 10:37:50 compute-2 ceph-mon[76053]: from='client.27356 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:50 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1431723179' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:37:50 compute-2 ceph-mon[76053]: from='client.17544 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:50 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/155377163' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Dec 01 10:37:50 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/256025264' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Dec 01 10:37:50 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1911794276' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 01 10:37:50 compute-2 ceph-mon[76053]: from='client.26833 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:50 compute-2 ceph-mon[76053]: from='client.27371 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:50 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3545264155' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Dec 01 10:37:50 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/684590407' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Dec 01 10:37:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:51 compute-2 sudo[253915]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 01 10:37:51 compute-2 sudo[253915]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 01 10:37:51 compute-2 sudo[253915]: pam_unix(sudo:session): session closed for user root
Dec 01 10:37:51 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: session ls {prefix=session ls} (starting...)
Dec 01 10:37:51 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec 01 10:37:51 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec 01 10:37:51 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1731355053' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 01 10:37:51 compute-2 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: status {prefix=status} (starting...)
Dec 01 10:37:51 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:37:51 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:51 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:51 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:51 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:37:51 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:51.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:37:51 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec 01 10:37:51 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2346899881' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 01 10:37:51 compute-2 ceph-mon[76053]: pgmap v1424: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:37:51 compute-2 ceph-mon[76053]: from='client.17562 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:51 compute-2 ceph-mon[76053]: from='client.27386 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:51 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2173440090' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Dec 01 10:37:51 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/287918769' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Dec 01 10:37:51 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2564244990' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Dec 01 10:37:51 compute-2 ceph-mon[76053]: from='client.26869 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:51 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2024571894' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Dec 01 10:37:51 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2815357942' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Dec 01 10:37:51 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1731355053' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 01 10:37:51 compute-2 ceph-mon[76053]: from='client.17583 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:51 compute-2 ceph-mon[76053]: from='client.26881 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:51 compute-2 ceph-mon[76053]: from='client.27422 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:51 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2608612401' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 01 10:37:51 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2909162309' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 01 10:37:51 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2346899881' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 01 10:37:51 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Dec 01 10:37:51 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3268875706' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec 01 10:37:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:52 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec 01 10:37:52 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3006238464' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 01 10:37:52 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:52 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:52 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:52.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:52 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Dec 01 10:37:52 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3610990087' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec 01 10:37:52 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 01 10:37:52 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1094526060' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 10:37:52 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:52 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:52 compute-2 ceph-mon[76053]: from='client.26896 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:52 compute-2 ceph-mon[76053]: from='client.27431 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:52 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3268875706' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec 01 10:37:52 compute-2 ceph-mon[76053]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec 01 10:37:52 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1025458884' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 01 10:37:52 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/708766779' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 01 10:37:52 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3006238464' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 01 10:37:52 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3430623412' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec 01 10:37:52 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/369458245' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec 01 10:37:52 compute-2 ceph-mon[76053]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec 01 10:37:52 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3610990087' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec 01 10:37:52 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1353961900' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 01 10:37:52 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1094526060' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 10:37:52 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/4055162567' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 01 10:37:52 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/493650774' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec 01 10:37:52 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Dec 01 10:37:52 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3176255192' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Dec 01 10:37:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:53 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Dec 01 10:37:53 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1506031666' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Dec 01 10:37:53 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec 01 10:37:53 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3669774970' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 01 10:37:53 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:53 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:53 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:53 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:37:53 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:53.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:37:53 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Dec 01 10:37:53 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3158326277' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec 01 10:37:53 compute-2 ceph-mon[76053]: pgmap v1425: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:37:53 compute-2 ceph-mon[76053]: from='client.26935 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:53 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3812692844' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec 01 10:37:53 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3369796777' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 10:37:53 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3176255192' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Dec 01 10:37:53 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3066064762' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 10:37:53 compute-2 ceph-mon[76053]: from='client.17631 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:53 compute-2 ceph-mon[76053]: from='client.27479 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:53 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1506031666' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Dec 01 10:37:53 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1197698938' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Dec 01 10:37:53 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3669774970' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 01 10:37:53 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3661865303' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Dec 01 10:37:53 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3575315556' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Dec 01 10:37:53 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1671515067' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Dec 01 10:37:53 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3158326277' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec 01 10:37:53 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2033843320' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 01 10:37:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:54 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec 01 10:37:54 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2926761762' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 01 10:37:54 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:54 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:37:54 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:54.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:37:54 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:54 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:54 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec 01 10:37:54 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3445514895' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:05.963583+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846182 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:06.963728+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:07.963872+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:08.964039+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44400 session 0x56365874e5a0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:09.964218+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44800 session 0x56365874eb40
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:10.964401+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846182 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:11.964630+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:12.964816+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:13.964986+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72105984 unmapped: 1171456 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:14.965149+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:15.965319+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846182 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:16.965491+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:17.965687+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:18.967663+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:19.967840+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:20.968048+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846182 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:21.968275+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:22.968446+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:23.968630+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:24.968814+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:25.969015+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846182 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:26.969241+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657e44800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:27.969451+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:28.969649+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:29.969857+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:30.970021+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846182 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:31.970240+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:32.970454+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 37.810539246s of 37.817691803s, submitted: 2
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:33.970689+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72130560 unmapped: 1146880 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:34.970835+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563656719800 session 0x5636586f0f00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:35.971068+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845591 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:36.971219+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:37.971385+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:38.971524+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:39.971908+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:40.972048+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845591 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:41.972276+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:42.972428+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:43.972948+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:44.973202+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:45.973454+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845591 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:46.973678+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:47.973928+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:48.974149+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657206c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.038652420s of 16.043939590s, submitted: 1
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:49.974284+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:50.974445+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:51.974654+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:52.974865+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:53.975184+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:54.975363+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:55.975523+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:56.975718+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:57.975868+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:58.976086+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:05:59.976340+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:00.976576+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:01.976853+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:02.977247+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:03.977445+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:04.977589+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:05.977823+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:06.978016+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:07.978201+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:08.978379+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:09.978542+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:10.978738+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:11.979024+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:12.979186+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:13.979369+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:14.979530+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:15.979744+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:16.979899+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:17.980212+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:18.981307+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:19.982664+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:20.984079+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:21.984296+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:22.984447+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:23.984644+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:24.985095+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:25.985260+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:26.985425+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:27.985572+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:28.986105+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:29.986293+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:30.986481+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:31.987004+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:32.987190+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:33.987645+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:34.987972+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:35.988148+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:36.988460+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:37.988662+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:38.988830+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:39.989052+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:40.989338+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:41.989550+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:42.989762+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:43.989903+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:44.990050+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:45.990236+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:46.990705+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:47.990878+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:48.991159+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657206c00 session 0x5636586f12c0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:49.991492+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:50.991686+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:51.991850+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:52.992020+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:53.992190+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:54.992356+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:55.992538+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:56.992707+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:57.992904+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:58.993035+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:06:59.993211+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:00.993354+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:01.993543+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:02.993730+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 74.315505981s of 74.319114685s, submitted: 1
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:03.993893+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 73728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:04.994092+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 73728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:05.994313+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 73728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657e44400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848615 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:06.994478+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 65536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:07.994691+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 65536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:08.994839+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 65536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:09.995039+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 65536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:10.995230+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 65536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848615 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:11.995426+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 65536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:12.995607+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 65536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:13.995851+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73220096 unmapped: 57344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:14.996093+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73220096 unmapped: 57344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:15.996501+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 49152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848615 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:16.996670+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 49152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.970045090s of 13.995903015s, submitted: 1
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:17.996836+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:18.997017+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:19.997190+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:20.997394+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:21.997679+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847433 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:22.997880+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:23.998050+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:24.998193+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:25.998349+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:26.998495+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847433 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:27.998653+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:28.998832+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:29.999005+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:30.999152+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:31.999330+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847433 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:32.999493+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:33.999681+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44400 session 0x563655b9ab40
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:34.999857+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:36.000002+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:37.000149+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847433 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:38.000485+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:39.000679+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:40.000898+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:41.001121+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:42.001377+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847433 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:43.001627+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:44.001794+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:45.002006+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:46.002155+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:47.002302+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847433 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:48.002476+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:49.002686+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:50.002864+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658210000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 33.174911499s of 33.210174561s, submitted: 2
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:51.003066+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:52.003265+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848945 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:53.003423+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:54.003636+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:55.003793+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:56.003976+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:57.004136+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848945 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658210800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:58.004305+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:07:59.004481+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:00.004669+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:01.004816+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:02.005060+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851969 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:03.005232+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.003420830s of 13.025437355s, submitted: 3
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:04.005422+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:05.005661+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:06.005871+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:07.006079+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851378 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:08.006275+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:09.006480+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:10.006643+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:11.006887+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:12.007112+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851378 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:13.007292+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:14.007482+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:15.007664+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:16.007926+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44800 session 0x56365874f680
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:17.008230+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851378 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:18.008469+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:19.008671+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:20.008862+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:21.009051+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:22.009255+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851378 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:23.009428+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:24.009645+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:25.009806+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:26.010001+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:27.010196+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851378 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:28.010394+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread fragmentation_score=0.000022 took=0.000081s
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:29.010557+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:30.010710+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 26.606863022s of 26.614999771s, submitted: 2
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:31.010904+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:32.011212+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852890 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:33.011395+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658211000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:34.011550+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:35.011690+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:36.011846+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:37.012050+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852299 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:38.012225+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:39.012397+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:40.012576+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:41.012812+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:42.013060+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:43.013223+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:44.013350+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:45.013539+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:46.013718+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:47.013864+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:48.014015+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:49.014173+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:50.014322+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:51.014452+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:52.014651+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:53.014814+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:54.014987+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:55.015144+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:56.015323+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:57.015496+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:58.015696+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:08:59.015905+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:00.016187+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:01.016356+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:02.016546+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:03.016748+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:04.016895+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:05.017064+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:06.017248+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:07.017413+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:08.017630+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:09.017808+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:10.018091+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563658210000 session 0x563658762b40
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:11.018307+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:12.018571+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:13.018970+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:14.019147+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:15.019366+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:16.019516+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:17.019710+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:18.019838+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:19.019988+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:20.020162+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:21.020406+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:22.020750+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:23.020909+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:24.021130+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 53.971313477s of 53.995193481s, submitted: 3
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:25.021273+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:26.021522+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:27.021698+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658210000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853220 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Cumulative writes: 5898 writes, 24K keys, 5898 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
                                           Cumulative WAL: 5898 writes, 1028 syncs, 5.74 writes per sync, written: 0.02 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 466 writes, 729 keys, 466 commit groups, 1.0 writes per commit group, ingest: 0.24 MB, 0.00 MB/s
                                           Interval WAL: 466 writes, 228 syncs, 2.04 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.020       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.020       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.02              0.00         1    0.020       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5636543369b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5636543369b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5636543369b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.006       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.2 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72245248 unmapped: 1032192 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:28.021955+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72245248 unmapped: 1032192 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:29.022125+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72245248 unmapped: 1032192 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:30.022342+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72245248 unmapped: 1032192 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:31.022483+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72245248 unmapped: 1032192 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:32.022694+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853220 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72245248 unmapped: 1032192 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:33.022930+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:34.023147+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:35.023332+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:36.023564+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:37.023821+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852629 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:38.023975+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563658211000 session 0x563658763680
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563658210800 session 0x56365876e780
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:39.024194+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:40.024426+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:41.024730+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:42.024964+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852629 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:43.025177+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:44.025372+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:45.025708+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:46.026012+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:47.026275+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852629 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:48.026583+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:49.026974+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:50.027176+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:51.027361+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:52.027580+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852629 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:53.027817+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:54.028044+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:55.028235+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 31.067621231s of 31.088729858s, submitted: 2
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657206c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:56.028461+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:57.028683+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 854141 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:58.028921+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:09:59.029131+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:00.029293+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:01.029464+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:02.029676+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:03.029832+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:04.030506+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:05.030680+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:06.030888+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:07.031040+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:08.031205+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:09.031393+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:10.031533+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:11.031675+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:12.031895+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:13.032040+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:14.032188+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:15.032339+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:16.032543+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:17.032667+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:18.032826+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:19.032975+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:20.033211+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:21.033387+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:22.033568+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:23.033742+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:24.033939+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:25.034105+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:26.034286+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:27.034472+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 31.795867920s of 31.820896149s, submitted: 2
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:28.034627+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72286208 unmapped: 991232 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:29.034776+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74530816 unmapped: 843776 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:30.035016+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:31.035188+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:32.035397+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:33.035561+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:34.035725+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:35.035928+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:36.036065+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:37.036233+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:38.036377+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74579968 unmapped: 1843200 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:39.036537+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74579968 unmapped: 1843200 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:40.036694+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74579968 unmapped: 1843200 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:41.036847+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74579968 unmapped: 1843200 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:42.037039+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74579968 unmapped: 1843200 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:43.037186+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:44.037355+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:45.037510+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:46.037673+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:47.037837+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:48.037990+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657206c00 session 0x56365793f860
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:49.038117+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:50.038269+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:51.038416+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74596352 unmapped: 1826816 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:52.038588+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74596352 unmapped: 1826816 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:53.038752+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74596352 unmapped: 1826816 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:54.039061+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563656719800 session 0x563658688b40
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:55.039240+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:56.039432+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:57.039560+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:58.039731+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:10:59.039897+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:00.040084+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:01.040259+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:02.040492+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 34.482105255s of 35.065586090s, submitted: 214
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855062 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:03.040672+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:04.040874+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:05.041081+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657e44400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:06.041245+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:07.041399+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 856574 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:08.042498+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:09.043271+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:10.043810+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 1802240 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:11.044158+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 1802240 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657e44800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:12.044469+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 1802240 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859598 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:13.045114+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 1802240 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:14.045816+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 1794048 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:15.046580+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 1794048 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:16.046947+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 1794048 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:17.047153+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 1794048 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.189227104s of 15.369211197s, submitted: 4
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:18.047698+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:19.048017+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:20.048215+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:21.048574+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:22.048881+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:23.049246+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:24.049534+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:25.049718+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:26.049961+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:27.050231+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:28.050513+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:29.050739+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:30.050976+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:31.051271+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:32.051557+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:33.051811+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:34.052020+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:35.052225+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:36.052446+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:37.052662+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:38.052823+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:39.053108+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:40.053361+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:41.053663+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:42.053949+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:43.054121+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:44.054353+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:45.054641+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:46.054914+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:47.055173+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:48.055385+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:49.055670+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:50.055893+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:51.056200+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:52.056512+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:53.056705+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:54.056955+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:55.057185+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:56.057350+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44400 session 0x56365874e960
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:57.057510+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:58.057732+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:11:59.057989+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:00.058146+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44800 session 0x5636587c41e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:01.058390+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:02.058636+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:03.058819+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:04.059000+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:05.059166+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:06.059378+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:07.059607+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:08.059819+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:09.060029+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:10.060200+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:11.060410+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:12.060691+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:13.060854+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657e44400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 55.563137054s of 55.571445465s, submitted: 2
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:14.061105+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:15.061402+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:16.061650+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:17.061865+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:18.062058+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859928 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:19.062280+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:20.062487+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:21.062709+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:22.062933+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:23.063178+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 860849 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:24.063378+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:25.063571+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.061788559s of 12.072693825s, submitted: 3
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:26.063828+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:27.064005+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:28.064289+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 860258 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:29.064517+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:30.064747+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:31.064950+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:32.065214+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:33.065378+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 860258 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:34.065667+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:35.065942+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:36.066200+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:37.066457+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:38.066764+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 860258 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:39.067159+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:40.067726+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:41.068177+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:42.068752+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:43.069034+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 860258 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:44.069344+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:45.069823+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:46.070116+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:47.070337+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:48.070574+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 860258 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:49.070823+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:50.071030+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 25.325607300s of 25.328844070s, submitted: 1
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:51.071254+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:52.071542+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:53.071783+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859667 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:54.072009+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:55.072210+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:56.072441+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:57.074179+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:58.074422+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859667 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:12:59.074677+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:00.074904+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:01.075054+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:02.075325+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:03.075559+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859667 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:04.075833+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:05.076075+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563658210000 session 0x563655b950e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:06.076356+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:07.076713+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:08.076945+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859667 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:09.077146+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:10.077377+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:11.077702+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:12.078036+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:13.078267+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859667 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:14.078496+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:15.078729+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:16.078935+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:17.079110+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:18.079250+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859667 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:19.079413+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 29.008197784s of 29.011583328s, submitted: 1
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:20.079657+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:21.079914+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:22.080246+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:23.080431+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862691 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:24.080713+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:25.080942+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:26.081215+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:27.081485+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:28.081690+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862691 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:29.081901+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:30.082169+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:31.082362+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:32.082705+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:33.082938+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862691 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:34.083176+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:35.083391+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:36.083652+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44400 session 0x5636587ca000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:37.083854+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:38.084061+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862691 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:39.084209+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:40.084382+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:41.084560+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:42.084806+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:43.085034+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862691 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:44.085210+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:45.085393+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:46.085646+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:47.085818+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:48.086055+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862691 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:49.086312+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657206c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:50.086486+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74694656 unmapped: 1728512 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 136 handle_osd_map epochs [136,137], i have 136, src has [1,137]
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 30.283664703s of 30.289636612s, submitted: 2
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:51.086667+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74735616 unmapped: 1687552 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 137 handle_osd_map epochs [137,138], i have 137, src has [1,138]
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:52.086866+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 696320 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 138 handle_osd_map epochs [138,139], i have 138, src has [1,139]
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 139 ms_handle_reset con 0x563657206c00 session 0x5636587d4000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:53.087034+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 663552 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658210800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658211000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965957 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 139 heartbeat osd_stat(store_statfs(0x4fc27b000/0x0/0x4ffc00000, data 0x8ed7bb/0x99f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:54.087279+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76922880 unmapped: 16285696 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 139 handle_osd_map epochs [139,140], i have 139, src has [1,140]
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 140 ms_handle_reset con 0x563658210800 session 0x5636587c50e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:55.087468+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76922880 unmapped: 16285696 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:56.087645+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76922880 unmapped: 16285696 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:57.087871+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76922880 unmapped: 16285696 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:58.088091+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76922880 unmapped: 16285696 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 140 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd5f919/0xe12000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969647 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:13:59.088279+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76922880 unmapped: 16285696 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:00.088499+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76922880 unmapped: 16285696 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 140 handle_osd_map epochs [140,141], i have 140, src has [1,141]
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.993000031s of 10.192948341s, submitted: 43
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:01.088688+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:02.088961+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe06000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:03.089203+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970213 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:04.089457+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:05.089652+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:06.089897+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe06000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:07.090152+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:08.090414+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969622 data_alloc: 218103808 data_used: 73728
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe06000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:09.090677+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76816384 unmapped: 16392192 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:10.091148+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76816384 unmapped: 16392192 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:11.091404+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76816384 unmapped: 16392192 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:12.091700+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76816384 unmapped: 16392192 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:13.091945+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76816384 unmapped: 16392192 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969855 data_alloc: 218103808 data_used: 77824
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:14.092190+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.708740234s of 14.725612640s, submitted: 13
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:15.092377+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:16.092578+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:17.092785+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:18.092949+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969264 data_alloc: 218103808 data_used: 77824
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:19.093250+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:20.093465+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:21.093742+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:22.094038+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:23.094355+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969264 data_alloc: 218103808 data_used: 77824
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:24.094553+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:25.094791+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:26.095070+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:27.095231+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:28.095451+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969264 data_alloc: 218103808 data_used: 77824
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:29.095742+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:30.095997+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:31.096272+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:32.096555+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658210800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 141 ms_handle_reset con 0x563658210800 session 0x5636587d4f00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657206c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 141 ms_handle_reset con 0x563657206c00 session 0x5636587d50e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657e44400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 141 ms_handle_reset con 0x563657e44400 session 0x5636587d54a0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:33.096770+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76832768 unmapped: 16375808 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969264 data_alloc: 218103808 data_used: 77824
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657e44800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.942977905s of 18.945615768s, submitted: 1
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 141 ms_handle_reset con 0x563657e44800 session 0x5636587d5860
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658210000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:34.096933+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76840960 unmapped: 16367616 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 141 ms_handle_reset con 0x563658210000 session 0x563655b950e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:35.097097+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76840960 unmapped: 16367616 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657206c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 141 ms_handle_reset con 0x563657206c00 session 0x5636572210e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657e44800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:36.097300+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 141 handle_osd_map epochs [141,142], i have 141, src has [1,142]
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76840960 unmapped: 16367616 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658210800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:37.097520+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 142 handle_osd_map epochs [142,143], i have 142, src has [1,143]
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76840960 unmapped: 16367616 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563657e44800 session 0x563657ed6f00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658211400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563658211400 session 0x56365876e1e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658211800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:38.097707+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563658211800 session 0x56365874f680
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658211c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78241792 unmapped: 14966784 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563658211c00 session 0x56365874f0e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658211c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563658211c00 session 0x56365861c960
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1025258 data_alloc: 218103808 data_used: 81920
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:39.097945+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78241792 unmapped: 14966784 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:40.098182+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78241792 unmapped: 14966784 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:41.098436+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78258176 unmapped: 14950400 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657206c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563657206c00 session 0x5636587ca3c0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fb9f0000/0x0/0x4ffc00000, data 0x1174c29/0x122b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:42.098712+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78258176 unmapped: 14950400 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:43.098974+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657e44800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563657e44800 session 0x5636587ca000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78258176 unmapped: 14950400 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1024667 data_alloc: 218103808 data_used: 81920
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:44.099200+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658211800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563658211800 session 0x5636587c41e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658211400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.451130867s of 10.229439735s, submitted: 42
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 79314944 unmapped: 13893632 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563658211400 session 0x5636586892c0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:45.099447+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78856192 unmapped: 14352384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657206c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:46.099672+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78856192 unmapped: 14352384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:47.099946+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fb9ec000/0x0/0x4ffc00000, data 0x1176c74/0x122f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 80183296 unmapped: 13025280 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:48.100117+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fb9ec000/0x0/0x4ffc00000, data 0x1176c74/0x122f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1047142 data_alloc: 218103808 data_used: 2912256
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:49.100357+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:50.100556+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:51.100805+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:52.101053+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:53.101242+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fb9ec000/0x0/0x4ffc00000, data 0x1176c74/0x122f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1047142 data_alloc: 218103808 data_used: 2912256
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:54.101383+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:55.101511+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fb9ec000/0x0/0x4ffc00000, data 0x1176c74/0x122f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:56.101660+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:57.101822+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.154132843s of 13.616702080s, submitted: 20
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:58.101976+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 88580096 unmapped: 4628480 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1142440 data_alloc: 218103808 data_used: 3985408
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x5636587cb4a0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:14:59.102156+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 92405760 unmapped: 1851392 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:00.102344+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9d5d000/0x0/0x4ffc00000, data 0x1c58c74/0x1d11000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,1])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90374144 unmapped: 3883008 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:01.102492+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90374144 unmapped: 3883008 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:02.102712+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90382336 unmapped: 3874816 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:03.102886+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90382336 unmapped: 3874816 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161392 data_alloc: 218103808 data_used: 4464640
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:04.103044+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cb1000/0x0/0x4ffc00000, data 0x1d12c74/0x1dcb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90415104 unmapped: 3842048 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:05.103159+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cb1000/0x0/0x4ffc00000, data 0x1d12c74/0x1dcb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90423296 unmapped: 3833856 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:06.103325+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90431488 unmapped: 3825664 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:07.103501+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90431488 unmapped: 3825664 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:08.103642+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90431488 unmapped: 3825664 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161544 data_alloc: 218103808 data_used: 4534272
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:09.103825+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90447872 unmapped: 3809280 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:10.103989+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90447872 unmapped: 3809280 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.395915031s of 12.716011047s, submitted: 144
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c8d000/0x0/0x4ffc00000, data 0x1d36c74/0x1def000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:11.104154+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90652672 unmapped: 3604480 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:12.104326+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90652672 unmapped: 3604480 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:13.104472+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90652672 unmapped: 3604480 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c82000/0x0/0x4ffc00000, data 0x1d41c74/0x1dfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1164216 data_alloc: 218103808 data_used: 4538368
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:14.104677+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90652672 unmapped: 3604480 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c82000/0x0/0x4ffc00000, data 0x1d41c74/0x1dfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:15.104818+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90652672 unmapped: 3604480 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:16.105040+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90652672 unmapped: 3604480 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636587f6800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c82000/0x0/0x4ffc00000, data 0x1d41c74/0x1dfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:17.105183+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90652672 unmapped: 3604480 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:18.105392+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90660864 unmapped: 3596288 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1166420 data_alloc: 218103808 data_used: 4538368
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:19.105547+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90669056 unmapped: 4636672 heap: 95305728 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658211000 session 0x5636587cc960
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:20.105676+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90669056 unmapped: 4636672 heap: 95305728 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c7f000/0x0/0x4ffc00000, data 0x1d44c74/0x1dfd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:21.106016+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90669056 unmapped: 4636672 heap: 95305728 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:22.106310+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90669056 unmapped: 4636672 heap: 95305728 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:23.106460+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90669056 unmapped: 4636672 heap: 95305728 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1165829 data_alloc: 218103808 data_used: 4538368
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:24.106695+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90669056 unmapped: 4636672 heap: 95305728 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636587f6400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.344936371s of 14.540517807s, submitted: 9
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6400 session 0x563657b5af00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:25.106867+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5fc00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x5636577f14a0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90554368 unmapped: 4751360 heap: 95305728 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c7f000/0x0/0x4ffc00000, data 0x1d44c74/0x1dfd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:26.107662+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636587623c0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91594752 unmapped: 19603456 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636587cd0e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5fc00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x56365579b680
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x563655c52f00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658211000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658211000 session 0x563657221c20
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:27.108353+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91619328 unmapped: 19578880 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:28.109081+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91619328 unmapped: 19578880 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1266778 data_alloc: 218103808 data_used: 4538368
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:29.109252+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91619328 unmapped: 19578880 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:30.109572+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91619328 unmapped: 19578880 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:31.109919+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91627520 unmapped: 19570688 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f13000/0x0/0x4ffc00000, data 0x2aafc74/0x2b68000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:32.110277+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91627520 unmapped: 19570688 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636587f6400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6400 session 0x563657911c20
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:33.110523+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91652096 unmapped: 19546112 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5fc00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:34.110736+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271440 data_alloc: 218103808 data_used: 4542464
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91471872 unmapped: 19726336 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:35.110861+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f13000/0x0/0x4ffc00000, data 0x2aafc97/0x2b69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91471872 unmapped: 19726336 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:36.111166+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91471872 unmapped: 19726336 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:37.111414+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91471872 unmapped: 19726336 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:38.111547+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100147200 unmapped: 11051008 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:39.111714+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1366136 data_alloc: 234881024 data_used: 18526208
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104595456 unmapped: 6602752 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f13000/0x0/0x4ffc00000, data 0x2aafc97/0x2b69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.594265938s of 14.785900116s, submitted: 47
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:40.111970+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f13000/0x0/0x4ffc00000, data 0x2aafc97/0x2b69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104628224 unmapped: 6569984 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:41.112286+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f13000/0x0/0x4ffc00000, data 0x2aafc97/0x2b69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104628224 unmapped: 6569984 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:42.112584+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104628224 unmapped: 6569984 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:43.112762+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104628224 unmapped: 6569984 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:44.113065+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1364363 data_alloc: 234881024 data_used: 18526208
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104628224 unmapped: 6569984 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:45.113249+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f13000/0x0/0x4ffc00000, data 0x2aafc97/0x2b69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104693760 unmapped: 6504448 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:46.113426+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104693760 unmapped: 6504448 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:47.113680+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104693760 unmapped: 6504448 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:48.113869+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104693760 unmapped: 6504448 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:49.114060+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1393335 data_alloc: 234881024 data_used: 18522112
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 5136384 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:50.114267+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.925302505s of 10.360248566s, submitted: 103
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f82aa000/0x0/0x4ffc00000, data 0x3718c97/0x37d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112877568 unmapped: 2727936 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:51.114493+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113098752 unmapped: 2506752 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-mon[76053]: from='client.26983 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:54 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/758342641' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 01 10:37:54 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1670393597' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec 01 10:37:54 compute-2 ceph-mon[76053]: from='client.27521 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:54 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2926761762' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 01 10:37:54 compute-2 ceph-mon[76053]: from='client.26998 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:54 compute-2 ceph-mon[76053]: from='client.17670 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:54 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1949553522' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec 01 10:37:54 compute-2 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 01 10:37:54 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3445514895' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 01 10:37:54 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/4020519107' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 01 10:37:54 compute-2 ceph-mon[76053]: from='client.27016 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:52.114810+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 4464640 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:53.114972+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7f87000/0x0/0x4ffc00000, data 0x3a2dc97/0x3ae7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7f92000/0x0/0x4ffc00000, data 0x3a30c97/0x3aea000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 4464640 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:54.115136+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1486231 data_alloc: 234881024 data_used: 19931136
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 4456448 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:55.115330+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 4456448 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:56.115671+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 4456448 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:57.116212+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111157248 unmapped: 4448256 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:58.116397+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111157248 unmapped: 4448256 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7f8d000/0x0/0x4ffc00000, data 0x3a35c97/0x3aef000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563658762b40
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x5636587c4d20
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:15:59.117160+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1486275 data_alloc: 234881024 data_used: 19931136
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658211000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101130240 unmapped: 14475264 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:00.117319+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.153506279s of 10.047043800s, submitted: 85
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658211000 session 0x5636578ac780
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100425728 unmapped: 15179776 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:01.117673+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100237312 unmapped: 15368192 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:02.118009+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100237312 unmapped: 15368192 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c75000/0x0/0x4ffc00000, data 0x1d4dc74/0x1e06000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:03.118304+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100311040 unmapped: 15294464 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:04.118755+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182611 data_alloc: 218103808 data_used: 4526080
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100311040 unmapped: 15294464 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:05.119149+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100311040 unmapped: 15294464 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:06.119314+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100311040 unmapped: 15294464 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:07.119642+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c73000/0x0/0x4ffc00000, data 0x1d50c74/0x1e09000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100311040 unmapped: 15294464 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:08.119934+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657206c00 session 0x563656b24000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100311040 unmapped: 15294464 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636587f6400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:09.120179+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1018203 data_alloc: 218103808 data_used: 90112
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6400 session 0x563656312d20
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:10.120449+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:11.120709+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:12.120996+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:13.121242+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:14.121484+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1017647 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:15.121728+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:16.121947+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:17.122165+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:18.122323+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:19.122719+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1017647 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:20.122937+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:21.123110+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:22.123321+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:23.123574+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:24.123780+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1017647 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:25.124008+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:26.124187+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:27.124400+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:28.124695+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:29.124978+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1017647 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:30.125150+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:31.125346+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655bc50e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5fc00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x563655bc4780
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657206c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657206c00 session 0x563655bc5860
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658211000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658211000 session 0x56365876f0e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636587f6400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 31.414421082s of 31.503835678s, submitted: 50
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 105201664 unmapped: 23019520 heap: 128221184 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6400 session 0x56365876ef00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:32.125473+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563657ed74a0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ee2000/0x0/0x4ffc00000, data 0x1ae2ca3/0x1b9a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:33.125583+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x563655b9b4a0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ee2000/0x0/0x4ffc00000, data 0x1ae2ca3/0x1b9a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:34.125794+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1122684 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:35.125929+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:36.126070+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:37.126237+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:38.126421+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ee2000/0x0/0x4ffc00000, data 0x1ae2ca3/0x1b9a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:39.126553+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1122684 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:40.126708+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5fc00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x56365579a3c0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 96010240 unmapped: 35889152 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:41.126900+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657206c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 96010240 unmapped: 35889152 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:42.127091+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 96018432 unmapped: 35880960 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:43.127322+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 98639872 unmapped: 33259520 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:44.127519+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1209473 data_alloc: 234881024 data_used: 12808192
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ee1000/0x0/0x4ffc00000, data 0x1ae2cc6/0x1b9b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 29384704 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:45.127671+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 29384704 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:46.127847+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 29384704 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:47.127959+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 29384704 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:48.128129+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 29384704 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:49.128240+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1209473 data_alloc: 234881024 data_used: 12808192
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 29384704 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:50.128402+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ee1000/0x0/0x4ffc00000, data 0x1ae2cc6/0x1b9b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102531072 unmapped: 29368320 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:51.128541+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102531072 unmapped: 29368320 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:52.128792+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6800 session 0x563657973c20
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658210800 session 0x5636579732c0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102531072 unmapped: 29368320 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:53.128961+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ee1000/0x0/0x4ffc00000, data 0x1ae2cc6/0x1b9b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102580224 unmapped: 29319168 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 21.930128098s of 22.232917786s, submitted: 48
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:54.129122+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ee1000/0x0/0x4ffc00000, data 0x1ae2cc6/0x1b9b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,3])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308349 data_alloc: 234881024 data_used: 13668352
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110280704 unmapped: 21618688 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:55.129337+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111304704 unmapped: 20594688 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:56.129545+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111304704 unmapped: 20594688 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:57.129750+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9134000/0x0/0x4ffc00000, data 0x288ecc6/0x2947000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111304704 unmapped: 20594688 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:58.129927+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111312896 unmapped: 20586496 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:16:59.130073+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9134000/0x0/0x4ffc00000, data 0x288ecc6/0x2947000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1318541 data_alloc: 234881024 data_used: 13930496
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111345664 unmapped: 20553728 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:00.130234+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111493120 unmapped: 20406272 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:01.130378+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:02.130962+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:03.131170+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:04.131816+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1316725 data_alloc: 234881024 data_used: 13930496
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:05.132372+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9114000/0x0/0x4ffc00000, data 0x28afcc6/0x2968000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:06.132530+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:07.132710+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.554829597s of 13.991487503s, submitted: 126
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:08.132983+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:09.133298+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f910e000/0x0/0x4ffc00000, data 0x28b5cc6/0x296e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1316973 data_alloc: 234881024 data_used: 13930496
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658210800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:10.133479+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:11.133791+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:12.134076+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:13.134349+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:14.134552+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1319549 data_alloc: 234881024 data_used: 13942784
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9103000/0x0/0x4ffc00000, data 0x28c0cc6/0x2979000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:15.134741+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:16.134976+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:17.135177+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110174208 unmapped: 21725184 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.098366737s of 10.114171028s, submitted: 5
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:18.135370+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657206c00 session 0x5636585872c0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655c534a0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:19.135569+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1035157 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:20.135785+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa9a8000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:21.135960+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:22.136183+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:23.136422+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:24.136633+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1035157 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:25.136956+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:26.137133+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa9a8000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:27.137439+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa9a8000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:28.137687+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:29.137927+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1035157 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:30.138129+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa9a8000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:31.138400+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:32.138672+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:33.138918+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:34.139090+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1035157 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:35.139272+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa9a8000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:36.139451+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:37.139579+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:38.139750+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:39.139916+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1035157 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa9a8000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:40.140089+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:41.140278+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:42.140482+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:43.140573+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5fc00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x56365579b0e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x5636578fc1e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636587f6800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6800 session 0x56365861da40
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636587f6800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:44.140784+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6800 session 0x563657ed7680
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101466112 unmapped: 30433280 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 26.040988922s of 26.121164322s, submitted: 35
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563657ed7c20
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5fc00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x5636585863c0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ebf000/0x0/0x4ffc00000, data 0x16f6c41/0x17ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x56365876ef00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657206c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1111180 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657206c00 session 0x563655bc4b40
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563656313680
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:45.140968+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101269504 unmapped: 34308096 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:46.141150+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101269504 unmapped: 34308096 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:47.141336+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101269504 unmapped: 34308096 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ebf000/0x0/0x4ffc00000, data 0x16f6c41/0x17ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:48.141515+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101269504 unmapped: 34308096 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101269504 unmapped: 34308096 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:49.254025+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1110956 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:50.254168+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101269504 unmapped: 34308096 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ebf000/0x0/0x4ffc00000, data 0x16f6c41/0x17ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5fc00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x563655bc5860
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:51.254309+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 99614720 unmapped: 35962880 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636587f6800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:52.254659+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 99614720 unmapped: 35962880 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9e9a000/0x0/0x4ffc00000, data 0x171ac64/0x17d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:53.254792+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101974016 unmapped: 33603584 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:54.254943+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182534 data_alloc: 234881024 data_used: 9736192
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:55.255079+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:56.255243+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:57.255407+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:58.256135+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9e9a000/0x0/0x4ffc00000, data 0x171ac64/0x17d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:17:59.256330+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182534 data_alloc: 234881024 data_used: 9736192
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:00.256538+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:01.256722+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:02.256900+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9e9a000/0x0/0x4ffc00000, data 0x171ac64/0x17d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.628173828s of 18.831754684s, submitted: 39
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:03.257096+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 24641536 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9141000/0x0/0x4ffc00000, data 0x246bc64/0x2523000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,1])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:04.257296+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112017408 unmapped: 23560192 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1312254 data_alloc: 234881024 data_used: 11382784
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:05.257496+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110411776 unmapped: 25165824 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:06.258068+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110411776 unmapped: 25165824 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:07.258242+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110411776 unmapped: 25165824 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f8c000/0x0/0x4ffc00000, data 0x2628c64/0x26e0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:08.258376+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110411776 unmapped: 25165824 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:09.258980+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110419968 unmapped: 25157632 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1319394 data_alloc: 234881024 data_used: 11612160
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:10.259303+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110419968 unmapped: 25157632 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:11.259786+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110551040 unmapped: 25026560 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:12.260391+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110551040 unmapped: 25026560 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f6b000/0x0/0x4ffc00000, data 0x2649c64/0x2701000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:13.260798+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110551040 unmapped: 25026560 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:14.260999+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110567424 unmapped: 25010176 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1321018 data_alloc: 234881024 data_used: 11685888
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:15.261393+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110567424 unmapped: 25010176 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:16.261672+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110567424 unmapped: 25010176 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f6b000/0x0/0x4ffc00000, data 0x2649c64/0x2701000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.655291557s of 14.004703522s, submitted: 146
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f6b000/0x0/0x4ffc00000, data 0x2649c64/0x2701000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:17.262094+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110682112 unmapped: 24895488 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:18.262439+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 24887296 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:19.262720+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 24887296 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1321058 data_alloc: 234881024 data_used: 11685888
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f5b000/0x0/0x4ffc00000, data 0x2659c64/0x2711000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:20.262972+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 24887296 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:21.263247+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 24887296 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:22.263515+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 24887296 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:23.263762+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 24887296 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f5b000/0x0/0x4ffc00000, data 0x2659c64/0x2711000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:24.263917+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110706688 unmapped: 24870912 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1321202 data_alloc: 234881024 data_used: 11685888
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657f58c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58c00 session 0x56365793ef00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657207000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657207000 session 0x56365793eb40
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:25.264132+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98000 session 0x56365793f4a0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636587cbc20
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110714880 unmapped: 24862720 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5fc00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x5636587caf00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657207000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657207000 session 0x5636587cab40
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:26.264287+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111935488 unmapped: 23642112 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:27.264528+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111935488 unmapped: 23642112 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:28.264778+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111935488 unmapped: 23642112 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:29.265004+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f85c2000/0x0/0x4ffc00000, data 0x2ff1cc6/0x30aa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111968256 unmapped: 23609344 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1398065 data_alloc: 234881024 data_used: 11685888
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:30.265202+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657f58c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58c00 session 0x5636587ca3c0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 23601152 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:31.265368+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587ca000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 23601152 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:32.265657+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636572454a0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.045106888s of 15.331792831s, submitted: 38
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 23601152 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563657244780
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:33.265832+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111132672 unmapped: 24444928 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f859d000/0x0/0x4ffc00000, data 0x3015cd6/0x30cf000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5fc00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:34.266035+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 24436736 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1400939 data_alloc: 234881024 data_used: 11685888
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:35.266749+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 24436736 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:36.267194+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111165440 unmapped: 24412160 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657207000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:37.293184+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117923840 unmapped: 17653760 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f859b000/0x0/0x4ffc00000, data 0x3016cd6/0x30d0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:38.293327+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 16211968 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:39.299528+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 16211968 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1469875 data_alloc: 234881024 data_used: 20869120
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:40.299725+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 16211968 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:41.299880+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 16211968 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:42.300092+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 16211968 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:43.300271+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 16211968 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f859b000/0x0/0x4ffc00000, data 0x3016cd6/0x30d0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:44.300545+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 16179200 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1469875 data_alloc: 234881024 data_used: 20869120
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:45.300717+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 16171008 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:46.300908+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 16171008 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:47.301090+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.843387604s of 15.061408043s, submitted: 10
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121667584 unmapped: 13910016 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7a78000/0x0/0x4ffc00000, data 0x3b34cd6/0x3bee000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,1])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:48.301307+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122765312 unmapped: 12812288 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:49.301453+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7a78000/0x0/0x4ffc00000, data 0x3b34cd6/0x3bee000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 124649472 unmapped: 10928128 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1578891 data_alloc: 234881024 data_used: 21770240
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:50.301668+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7a57000/0x0/0x4ffc00000, data 0x3b4dcd6/0x3c07000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [1])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 10395648 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:51.301821+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 10379264 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:52.302022+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 10379264 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:53.302762+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7a57000/0x0/0x4ffc00000, data 0x3b4dcd6/0x3c07000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 10379264 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:54.302901+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 10379264 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1580563 data_alloc: 234881024 data_used: 21909504
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:55.303045+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 10379264 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657207000 session 0x56365678f860
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x5636586f05a0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657f58c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:56.303179+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 10379264 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7a57000/0x0/0x4ffc00000, data 0x3b4dcd6/0x3c07000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,1])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:57.303310+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58c00 session 0x5636579730e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 17629184 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:58.303449+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f56000/0x0/0x4ffc00000, data 0x265dc64/0x2715000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 17629184 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:18:59.303622+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 17629184 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f56000/0x0/0x4ffc00000, data 0x265dc64/0x2715000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1336708 data_alloc: 234881024 data_used: 10829824
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:00.303771+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 17629184 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:01.303927+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 17629184 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f56000/0x0/0x4ffc00000, data 0x265dc64/0x2715000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.856204987s of 14.509012222s, submitted: 130
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x5636563130e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6800 session 0x5636577f0f00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:02.304099+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110075904 unmapped: 25501696 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x563656312d20
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f57000/0x0/0x4ffc00000, data 0x265dc64/0x2715000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:03.304239+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:04.304405+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1068678 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:05.304503+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:06.304664+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:07.304840+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:08.304998+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:09.305157+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1068678 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:10.305335+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:11.305514+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:12.305735+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:13.305884+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:14.305983+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1068678 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:15.306172+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:16.306404+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:17.306550+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:18.306682+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:19.306819+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1068678 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:20.306957+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:21.307108+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:22.307316+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:23.307453+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:24.307635+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1068678 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:25.307823+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:26.307957+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:27.308095+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.2 total, 600.0 interval
                                           Cumulative writes: 8246 writes, 33K keys, 8246 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s
                                           Cumulative WAL: 8246 writes, 1954 syncs, 4.22 writes per sync, written: 0.03 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 2348 writes, 8901 keys, 2348 commit groups, 1.0 writes per commit group, ingest: 10.48 MB, 0.02 MB/s
                                           Interval WAL: 2348 writes, 926 syncs, 2.54 writes per sync, written: 0.01 GB, 0.02 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636587d4b40
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5fc00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x5636587d5680
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587d43c0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655c22f00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 25.636785507s of 26.091878891s, submitted: 58
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x563655c225a0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636587f6800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:28.308230+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6800 session 0x563656b24000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657207000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657207000 session 0x563656b24b40
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657207000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657207000 session 0x563656b24780
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x563655c310e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110108672 unmapped: 33341440 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:29.308396+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110108672 unmapped: 33341440 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123514 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:30.308555+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110108672 unmapped: 33341440 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x56365874e780
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:31.308674+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x56365874e000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110108672 unmapped: 33341440 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636587f6800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6800 session 0x56365874f2c0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:32.308887+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110108672 unmapped: 33341440 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x56365874f0e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:33.309035+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110108672 unmapped: 33341440 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa081000/0x0/0x4ffc00000, data 0x1531c84/0x15eb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:34.309175+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110108672 unmapped: 33341440 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1155373 data_alloc: 218103808 data_used: 4284416
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:35.309309+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:36.309510+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa081000/0x0/0x4ffc00000, data 0x1531c84/0x15eb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:37.309711+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:38.309885+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:39.310020+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182125 data_alloc: 218103808 data_used: 8257536
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:40.310182+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:41.310409+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa081000/0x0/0x4ffc00000, data 0x1531c84/0x15eb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:42.310666+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa081000/0x0/0x4ffc00000, data 0x1531c84/0x15eb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:43.310811+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:44.311918+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa081000/0x0/0x4ffc00000, data 0x1531c84/0x15eb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182125 data_alloc: 218103808 data_used: 8257536
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:45.312099+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.438076019s of 17.504392624s, submitted: 12
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113065984 unmapped: 30384128 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:46.312985+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113065984 unmapped: 30384128 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:47.313151+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113205248 unmapped: 30244864 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:48.313794+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97fe000/0x0/0x4ffc00000, data 0x1db4c84/0x1e6e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:49.314518+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1256563 data_alloc: 218103808 data_used: 8261632
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:50.315134+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:51.315720+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:52.316260+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97ee000/0x0/0x4ffc00000, data 0x1dc4c84/0x1e7e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:53.316702+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:54.317099+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1255771 data_alloc: 218103808 data_used: 8261632
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:55.317499+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97eb000/0x0/0x4ffc00000, data 0x1dc7c84/0x1e81000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:56.317881+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97eb000/0x0/0x4ffc00000, data 0x1dc7c84/0x1e81000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:57.318139+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:58.318298+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:19:59.318451+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1255771 data_alloc: 218103808 data_used: 8261632
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:00.318708+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97eb000/0x0/0x4ffc00000, data 0x1dc7c84/0x1e81000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:01.319047+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97eb000/0x0/0x4ffc00000, data 0x1dc7c84/0x1e81000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:02.319279+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:03.319440+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:04.319664+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1255771 data_alloc: 218103808 data_used: 8261632
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:05.319842+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.239578247s of 20.645868301s, submitted: 49
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:06.320101+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 30154752 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:07.320263+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97e9000/0x0/0x4ffc00000, data 0x1dc9c84/0x1e83000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 30154752 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:08.320501+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 30154752 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:09.320665+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97e9000/0x0/0x4ffc00000, data 0x1dc9c84/0x1e83000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636577d9c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9c00 session 0x563657ed61e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636577d9800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9800 session 0x563657ed6780
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563659b7b800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563659b7b800 session 0x563657ed65a0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563659b7a000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 30154752 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563659b7a000 session 0x563657ed7e00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563659b7a000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:10.320806+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1290770 data_alloc: 218103808 data_used: 8261632
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563659b7a000 session 0x563657ed6f00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x563657ed7c20
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636577d9800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9800 session 0x563655c534a0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636577d9c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9c00 session 0x563657ed6000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563659b7b000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563659b7b000 session 0x56365874fa40
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113819648 unmapped: 29630464 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97e9000/0x0/0x4ffc00000, data 0x1dc9c84/0x1e83000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:11.320948+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113819648 unmapped: 29630464 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:12.321228+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113819648 unmapped: 29630464 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:13.321394+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113819648 unmapped: 29630464 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:14.321667+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587cad20
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114163712 unmapped: 29286400 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636577d9800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:15.321911+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636577d9c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1296076 data_alloc: 218103808 data_used: 8269824
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114163712 unmapped: 29286400 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:16.322044+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f94d3000/0x0/0x4ffc00000, data 0x20deca7/0x2199000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:17.322205+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:18.322890+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.166189194s of 12.288041115s, submitted: 40
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:19.323049+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f94d1000/0x0/0x4ffc00000, data 0x20dfca7/0x219a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:20.323878+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310548 data_alloc: 234881024 data_used: 10141696
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:21.324052+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f94d1000/0x0/0x4ffc00000, data 0x20dfca7/0x219a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:22.324429+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:23.324662+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:24.324910+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f94d1000/0x0/0x4ffc00000, data 0x20dfca7/0x219a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:25.325045+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310548 data_alloc: 234881024 data_used: 10141696
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f94d1000/0x0/0x4ffc00000, data 0x20dfca7/0x219a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:26.325180+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:27.325329+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119554048 unmapped: 23896064 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:28.325524+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.762595177s of 10.001276016s, submitted: 110
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 23732224 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:29.325686+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121634816 unmapped: 21815296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a60000/0x0/0x4ffc00000, data 0x2b42ca7/0x2bfd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:30.325905+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395038 data_alloc: 234881024 data_used: 11534336
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655c17000 session 0x563657244000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658d10000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:31.326110+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:32.326411+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:33.326660+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:34.326823+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6f000/0x0/0x4ffc00000, data 0x2b42ca7/0x2bfd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:35.327003+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395038 data_alloc: 234881024 data_used: 11534336
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:36.327203+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:37.327356+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:38.327515+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6f000/0x0/0x4ffc00000, data 0x2b42ca7/0x2bfd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:39.327675+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:40.327911+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395038 data_alloc: 234881024 data_used: 11534336
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.802636147s of 12.282186508s, submitted: 208
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:41.328096+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:42.328355+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:43.328504+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:44.328767+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6e000/0x0/0x4ffc00000, data 0x2b43ca7/0x2bfe000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,1])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:45.328918+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395262 data_alloc: 234881024 data_used: 11534336
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:46.329077+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:47.329262+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:48.329414+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:49.329564+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6b000/0x0/0x4ffc00000, data 0x2b44ca7/0x2bff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:50.329732+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395270 data_alloc: 234881024 data_used: 11534336
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120053760 unmapped: 23396352 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:51.329864+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6b000/0x0/0x4ffc00000, data 0x2b44ca7/0x2bff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120053760 unmapped: 23396352 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:52.330086+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120053760 unmapped: 23396352 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:53.330290+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120053760 unmapped: 23396352 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:54.330464+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120053760 unmapped: 23396352 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:55.330621+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395270 data_alloc: 234881024 data_used: 11534336
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.778802872s of 14.837076187s, submitted: 4
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120053760 unmapped: 23396352 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:56.330776+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6b000/0x0/0x4ffc00000, data 0x2b44ca7/0x2bff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120061952 unmapped: 23388160 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:57.330950+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120061952 unmapped: 23388160 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:58.331093+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120061952 unmapped: 23388160 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:20:59.331219+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120061952 unmapped: 23388160 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:00.331344+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395422 data_alloc: 234881024 data_used: 11534336
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6d000/0x0/0x4ffc00000, data 0x2b44ca7/0x2bff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120061952 unmapped: 23388160 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:01.331495+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:02.331711+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:03.331863+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:04.332006+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:05.332146+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395430 data_alloc: 234881024 data_used: 11534336
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:06.332310+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6b000/0x0/0x4ffc00000, data 0x2b45ca7/0x2c00000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.249480247s of 11.264521599s, submitted: 5
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:07.332467+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:08.332667+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6b000/0x0/0x4ffc00000, data 0x2b45ca7/0x2c00000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:09.332842+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6b000/0x0/0x4ffc00000, data 0x2b45ca7/0x2c00000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:10.332976+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120078336 unmapped: 23371776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395446 data_alloc: 234881024 data_used: 11534336
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6b000/0x0/0x4ffc00000, data 0x2b45ca7/0x2c00000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:11.333131+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120078336 unmapped: 23371776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:12.333394+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120078336 unmapped: 23371776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:13.333541+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120078336 unmapped: 23371776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:14.333694+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120078336 unmapped: 23371776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6c000/0x0/0x4ffc00000, data 0x2b45ca7/0x2c00000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:15.333820+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120086528 unmapped: 23363584 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1397446 data_alloc: 234881024 data_used: 11522048
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:16.333988+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120135680 unmapped: 23314432 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.100803375s of 10.120236397s, submitted: 16
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9800 session 0x5636587c5680
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9c00 session 0x5636587c4d20
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:17.334178+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120135680 unmapped: 23314432 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563659b7bc00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563659b7bc00 session 0x5636587cc1e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:18.334377+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:19.334561+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9797000/0x0/0x4ffc00000, data 0x1dcfc84/0x1e89000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:20.334914+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271742 data_alloc: 218103808 data_used: 8261632
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:21.335096+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:22.335290+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:23.335536+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:24.335701+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:25.335876+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271742 data_alloc: 218103808 data_used: 8261632
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9797000/0x0/0x4ffc00000, data 0x1dcfc84/0x1e89000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:26.336074+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:27.336239+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:28.336382+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563657911860
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x5636587c5c20
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.864808083s of 12.033938408s, submitted: 54
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:29.336572+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9797000/0x0/0x4ffc00000, data 0x1dcfc74/0x1e88000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111427584 unmapped: 32022528 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9797000/0x0/0x4ffc00000, data 0x1dcfc74/0x1e88000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:30.336750+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x56365678f860
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1095741 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:31.336918+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:32.337167+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:33.337352+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:34.337531+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:35.337744+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1095741 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:36.337958+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:37.338134+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:38.338383+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:39.338529+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:40.338755+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1095741 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:41.338923+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:42.339183+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:43.339363+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:44.339627+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:45.339829+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1095741 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:46.339992+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:47.340207+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:48.340363+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:49.340577+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:50.340902+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1095741 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:51.341095+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:52.342011+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:53.342244+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:54.342416+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:55.342734+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1095741 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:56.343375+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:57.343643+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587cba40
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636577d9800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9800 session 0x5636587cbc20
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636577d9c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9c00 session 0x563655aab0e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x56365678f0e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 28.008203506s of 29.148126602s, submitted: 24
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:58.343834+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110739456 unmapped: 32710656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655b9b2c0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x563657ed7860
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x5636577d9800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9800 session 0x5636586f0780
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563659b7bc00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563659b7bc00 session 0x563655c31e00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636577f03c0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:21:59.344058+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110698496 unmapped: 32751616 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cc4000/0x0/0x4ffc00000, data 0x18f0c51/0x19a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:00.344479+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110698496 unmapped: 32751616 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1188936 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:01.344676+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110698496 unmapped: 32751616 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:02.344839+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110673920 unmapped: 32776192 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636563121e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:03.345027+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cc3000/0x0/0x4ffc00000, data 0x18f0c74/0x19a9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110174208 unmapped: 33275904 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:04.345248+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110174208 unmapped: 33275904 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:05.345373+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110911488 unmapped: 32538624 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245921 data_alloc: 218103808 data_used: 8433664
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cc3000/0x0/0x4ffc00000, data 0x18f0c74/0x19a9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:06.345560+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 27623424 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:07.345802+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cc3000/0x0/0x4ffc00000, data 0x18f0c74/0x19a9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 27623424 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:08.346196+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cc3000/0x0/0x4ffc00000, data 0x18f0c74/0x19a9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 27623424 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:09.346418+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 27623424 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cc3000/0x0/0x4ffc00000, data 0x18f0c74/0x19a9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:10.346792+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: mgrc ms_handle_reset ms_handle_reset con 0x563655c16000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/1444264366
Dec 01 10:37:54 compute-2 ceph-osd[78644]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/1444264366,v1:192.168.122.100:6801/1444264366]
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: get_auth_request con 0x563659b7bc00 auth_method 0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: mgrc handle_mgr_configure stats_period=5
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.887072563s of 12.284139633s, submitted: 32
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x5636586f1680
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115965952 unmapped: 27484160 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563659b7b800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1267991 data_alloc: 234881024 data_used: 11882496
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563659b7b800 session 0x563657ed6000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:11.347013+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c64/0xe1f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:12.347326+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:13.347567+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:14.347805+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:15.347954+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1106325 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:16.348102+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:17.348299+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:18.348445+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:19.348638+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:20.348831+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1106325 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:21.349055+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:22.349291+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:23.349513+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658e52000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.862092018s of 12.971082687s, submitted: 35
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,6])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111779840 unmapped: 31670272 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658e52000 session 0x563656b24780
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658e52000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658e52000 session 0x5636577f05a0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x56365861cd20
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x56365876fe00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x56365874eb40
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:24.349736+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111689728 unmapped: 31760384 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:25.349875+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111689728 unmapped: 31760384 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1160474 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657e44800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:26.350036+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657e44800 session 0x56365678e780
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111689728 unmapped: 31760384 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587c4000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:27.350223+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636587c4960
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111697920 unmapped: 31752192 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x563656b661e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa1e4000/0x0/0x4ffc00000, data 0x13cfc74/0x1488000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:28.350423+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111706112 unmapped: 31744000 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:29.350576+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111706112 unmapped: 31744000 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658e52000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657e45400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:30.350738+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111738880 unmapped: 31711232 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1165862 data_alloc: 218103808 data_used: 208896
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:31.350867+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111542272 unmapped: 31907840 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:32.351070+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa1e4000/0x0/0x4ffc00000, data 0x13cfc74/0x1488000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111542272 unmapped: 31907840 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:33.351511+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111542272 unmapped: 31907840 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:34.351638+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.460638046s of 11.302368164s, submitted: 37
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111542272 unmapped: 31907840 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658e52000 session 0x56365579a3c0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657e45400 session 0x5636579730e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:35.351758+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111558656 unmapped: 31891456 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa1e4000/0x0/0x4ffc00000, data 0x13cfc74/0x1488000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,1])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1117454 data_alloc: 218103808 data_used: 94208
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:36.351971+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111345664 unmapped: 32104448 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84c000/0x0/0x4ffc00000, data 0xd67c74/0xe20000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,1])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:37.352161+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111362048 unmapped: 32088064 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:38.352310+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111370240 unmapped: 32079872 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:39.352453+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x563655c31e00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:40.352686+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114853 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:41.352851+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:42.353031+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:43.353208+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:44.353354+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:45.353576+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114853 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:46.353795+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:47.353947+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:48.354167+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:49.354380+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:50.417670+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114853 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:51.417875+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:52.418086+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:53.418315+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:54.418469+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:55.418607+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114853 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:56.418772+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:57.418956+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:58.419096+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:22:59.419384+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:00.419507+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114853 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:01.419740+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:02.419936+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:03.420126+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:04.420301+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:05.420450+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114853 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:06.420612+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655c521e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563656719800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x563655aabc20
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658e52000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658e52000 session 0x5636572450e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658241800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658241800 session 0x5636586f14a0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:07.420764+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658241800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 28.799690247s of 32.735015869s, submitted: 47
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 32309248 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658241800 session 0x56365579a000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:08.420914+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636586f10e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 32309248 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:09.421125+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 32309248 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:10.421256+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa40d000/0x0/0x4ffc00000, data 0x11a7ca3/0x125f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 32309248 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1154538 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636587d5e00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:11.421411+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657f58400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58400 session 0x563655bc4780
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 32309248 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657f59c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f59c00 session 0x563657972f00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:12.421582+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 32301056 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587ca3c0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:13.421826+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111157248 unmapped: 32292864 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:14.421966+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657f58400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111181824 unmapped: 32268288 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:15.422088+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1184975 data_alloc: 218103808 data_used: 4452352
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa40d000/0x0/0x4ffc00000, data 0x11a7ca3/0x125f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:16.422248+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:17.422490+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:18.422655+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa40d000/0x0/0x4ffc00000, data 0x11a7ca3/0x125f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:19.422868+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:20.422993+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa40d000/0x0/0x4ffc00000, data 0x11a7ca3/0x125f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1184975 data_alloc: 218103808 data_used: 4452352
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa40d000/0x0/0x4ffc00000, data 0x11a7ca3/0x125f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:21.423167+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:22.423333+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:23.423490+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:24.423680+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:25.424178+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.764535904s of 18.126991272s, submitted: 35
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 25804800 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1281861 data_alloc: 218103808 data_used: 5279744
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:26.424352+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 118022144 unmapped: 25427968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9825000/0x0/0x4ffc00000, data 0x1d81ca3/0x1e39000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:27.424512+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117243904 unmapped: 26206208 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:28.424669+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f981d000/0x0/0x4ffc00000, data 0x1d97ca3/0x1e4f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117243904 unmapped: 26206208 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f981d000/0x0/0x4ffc00000, data 0x1d97ca3/0x1e4f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:29.424841+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117243904 unmapped: 26206208 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:30.424991+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117243904 unmapped: 26206208 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f981d000/0x0/0x4ffc00000, data 0x1d97ca3/0x1e4f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1285985 data_alloc: 218103808 data_used: 5517312
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:31.425121+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117243904 unmapped: 26206208 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:32.425276+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117243904 unmapped: 26206208 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:33.425447+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117252096 unmapped: 26198016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:34.425626+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117252096 unmapped: 26198016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:35.425850+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f981d000/0x0/0x4ffc00000, data 0x1d97ca3/0x1e4f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117252096 unmapped: 26198016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1286001 data_alloc: 218103808 data_used: 5517312
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:36.425994+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117252096 unmapped: 26198016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:37.426152+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117252096 unmapped: 26198016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:38.426287+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117260288 unmapped: 26189824 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:39.426424+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117260288 unmapped: 26189824 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.862953186s of 14.593469620s, submitted: 123
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58400 session 0x5636587c4b40
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x56365793f860
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:40.426638+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f981d000/0x0/0x4ffc00000, data 0x1d97ca3/0x1e4f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117260288 unmapped: 26189824 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658241800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1284997 data_alloc: 218103808 data_used: 5517312
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:41.426829+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113876992 unmapped: 29573120 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658241800 session 0x563657d3a3c0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:42.427037+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113893376 unmapped: 29556736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:43.427229+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113893376 unmapped: 29556736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:44.427399+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113893376 unmapped: 29556736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:45.427567+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113893376 unmapped: 29556736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1127635 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:46.427751+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113893376 unmapped: 29556736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:47.427925+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113893376 unmapped: 29556736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:48.428136+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113893376 unmapped: 29556736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:49.428288+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113901568 unmapped: 29548544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:50.428445+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113901568 unmapped: 29548544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1127635 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:51.428677+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113901568 unmapped: 29548544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:52.428927+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113901568 unmapped: 29548544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:53.429139+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113901568 unmapped: 29548544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:54.429285+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113901568 unmapped: 29548544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:55.429461+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1127635 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:56.429659+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:57.429783+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:58.429963+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:23:59.430171+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:00.430336+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1127635 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:01.430491+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:02.430654+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:03.430785+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:04.430925+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:05.431074+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113942528 unmapped: 29507584 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1127635 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:06.431248+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113942528 unmapped: 29507584 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657f58800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58800 session 0x5636587d50e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587d5680
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636587d45a0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657f58400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58400 session 0x5636587d41e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:07.431361+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658241800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 26.469379425s of 27.278631210s, submitted: 38
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658241800 session 0x5636563121e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658240400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658240400 session 0x5636587d43c0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658240400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658240400 session 0x563655b94f00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587c4000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655c31c20
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114327552 unmapped: 29122560 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:08.431507+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114327552 unmapped: 29122560 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:09.431684+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114327552 unmapped: 29122560 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa374000/0x0/0x4ffc00000, data 0x123fcb3/0x12f8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:10.431812+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114327552 unmapped: 29122560 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1171981 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:11.432016+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657f58400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58400 session 0x563655aabc20
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114327552 unmapped: 29122560 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563658241800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658241800 session 0x56365876e5a0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:12.432171+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114327552 unmapped: 29122560 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x56365876fe00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:13.432314+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x56365876e780
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114335744 unmapped: 29114368 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657f58400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:14.432453+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113491968 unmapped: 29958144 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:15.432644+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa374000/0x0/0x4ffc00000, data 0x123fcb3/0x12f8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:16.433042+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1203457 data_alloc: 218103808 data_used: 4964352
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:17.433283+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:18.433480+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa374000/0x0/0x4ffc00000, data 0x123fcb3/0x12f8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:19.433668+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:20.433811+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:21.434030+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1203457 data_alloc: 218103808 data_used: 4964352
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:22.434268+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:23.434437+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa374000/0x0/0x4ffc00000, data 0x123fcb3/0x12f8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:24.434619+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:25.434802+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.541067123s of 18.684326172s, submitted: 42
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114819072 unmapped: 28631040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:26.434944+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1264811 data_alloc: 218103808 data_used: 4960256
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117473280 unmapped: 25976832 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:27.435119+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119947264 unmapped: 23502848 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:28.435326+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a97000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a97000 session 0x5636587d4000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a96800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a96800 session 0x5636563125a0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a97c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a97c00 session 0x5636577f1860
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:29.435502+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a96800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a96800 session 0x5636586883c0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a97000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a97000 session 0x56365876e3c0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x563657244d20
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655aab680
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657afd400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657afd400 session 0x563657ed70e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a96800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a96800 session 0x5636587d4d20
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119300096 unmapped: 24150016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9401000/0x0/0x4ffc00000, data 0x21b0cec/0x226b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:30.435715+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119300096 unmapped: 24150016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:31.435878+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1337422 data_alloc: 218103808 data_used: 5795840
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119300096 unmapped: 24150016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:32.436129+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9401000/0x0/0x4ffc00000, data 0x21b0d25/0x226b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119332864 unmapped: 24117248 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:33.436313+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119455744 unmapped: 23994368 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9401000/0x0/0x4ffc00000, data 0x21b0d25/0x226b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a97000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a97000 session 0x56365678fc20
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:34.436726+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f93e0000/0x0/0x4ffc00000, data 0x21d1d25/0x228c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x56365793e780
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119463936 unmapped: 23986176 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:35.437451+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563656313680
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x56365631b000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x56365631b000 session 0x5636587cb680
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a96800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a97000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119480320 unmapped: 23969792 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:36.437865+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1341137 data_alloc: 218103808 data_used: 5799936
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119480320 unmapped: 23969792 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:37.438358+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f93de000/0x0/0x4ffc00000, data 0x21d1d58/0x228e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 22298624 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:38.438735+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122757120 unmapped: 20692992 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:39.439114+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.288687706s of 13.754534721s, submitted: 156
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122765312 unmapped: 20684800 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:40.439317+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 20946944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:41.439770+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1383849 data_alloc: 234881024 data_used: 12136448
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 20946944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:42.440161+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f93d3000/0x0/0x4ffc00000, data 0x21dcd58/0x2299000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 20946944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:43.440323+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 20946944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:44.440697+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f93d3000/0x0/0x4ffc00000, data 0x21dcd58/0x2299000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 20946944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:45.440882+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 20946944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f93d3000/0x0/0x4ffc00000, data 0x21dcd58/0x2299000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:46.441050+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1383849 data_alloc: 234881024 data_used: 12136448
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 20946944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:47.441225+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f93d3000/0x0/0x4ffc00000, data 0x21dcd58/0x2299000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 123568128 unmapped: 19881984 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:48.441504+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128155648 unmapped: 15294464 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:49.441791+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.859272957s of 10.086807251s, submitted: 50
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128344064 unmapped: 15106048 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:50.442039+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128344064 unmapped: 15106048 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:51.442199+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1455911 data_alloc: 234881024 data_used: 12869632
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128344064 unmapped: 15106048 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:52.442690+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87c6000/0x0/0x4ffc00000, data 0x29d9d58/0x2a96000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128344064 unmapped: 15106048 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:53.442908+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128352256 unmapped: 15097856 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:54.443067+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128352256 unmapped: 15097856 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:55.443256+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87c6000/0x0/0x4ffc00000, data 0x29d9d58/0x2a96000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128352256 unmapped: 15097856 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:56.443460+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1455911 data_alloc: 234881024 data_used: 12869632
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:57.443694+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128352256 unmapped: 15097856 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:58.443933+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128368640 unmapped: 15081472 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:24:59.444162+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128368640 unmapped: 15081472 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87c6000/0x0/0x4ffc00000, data 0x29d9d58/0x2a96000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:00.444379+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128376832 unmapped: 15073280 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:01.444614+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128376832 unmapped: 15073280 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87c6000/0x0/0x4ffc00000, data 0x29d9d58/0x2a96000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1456703 data_alloc: 234881024 data_used: 12951552
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:02.444836+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128376832 unmapped: 15073280 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:03.445059+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128376832 unmapped: 15073280 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.702910423s of 13.609095573s, submitted: 14
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a96800 session 0x56365876ef00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a97000 session 0x5636587ca3c0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87c6000/0x0/0x4ffc00000, data 0x29d9d58/0x2a96000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x563657b5a3c0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:04.445265+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 123920384 unmapped: 19529728 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:05.445452+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 123920384 unmapped: 19529728 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:06.445627+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 123920384 unmapped: 19529728 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301916 data_alloc: 218103808 data_used: 5804032
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:07.445767+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 123920384 unmapped: 19529728 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58400 session 0x56365874ed20
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:08.445938+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f95c9000/0x0/0x4ffc00000, data 0x1bd8cb3/0x1c91000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655aab680
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:09.446110+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:10.446260+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:11.446451+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150304 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:12.446687+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa438000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:13.446850+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:14.447010+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:15.447194+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa438000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:16.447337+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150304 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa438000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:17.447517+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:18.447689+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:19.447817+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:20.447980+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa438000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:21.448115+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150304 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:22.448320+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa438000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:23.448513+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:24.448678+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:25.448851+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:26.448974+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150304 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:27.449163+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:28.449351+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa438000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:29.449502+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:30.449677+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:31.449800+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150304 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:32.450021+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa438000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655d5f400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 29.523105621s of 29.712411880s, submitted: 70
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:33.450184+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563656b24000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 31752192 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:34.450342+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 31752192 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9699000/0x0/0x4ffc00000, data 0x1b0cc41/0x1bc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:35.450475+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 31752192 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:36.450643+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 31752192 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1254732 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:37.450788+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 31752192 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a96800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a96800 session 0x56365678e3c0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:38.450931+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 31744000 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a97000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a97000 session 0x5636587c50e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:39.451182+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a98c00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636578fd0e0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657f58400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 31744000 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58400 session 0x5636586885a0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9699000/0x0/0x4ffc00000, data 0x1b0cc41/0x1bc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:40.451339+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 31744000 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563657f58400
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9699000/0x0/0x4ffc00000, data 0x1b0cc41/0x1bc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:41.451511+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122322944 unmapped: 28999680 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1317052 data_alloc: 234881024 data_used: 9334784
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:42.451683+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9699000/0x0/0x4ffc00000, data 0x1b0cc41/0x1bc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:43.451872+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:44.452055+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:45.452216+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:46.452388+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1348212 data_alloc: 234881024 data_used: 14024704
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:47.452586+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9699000/0x0/0x4ffc00000, data 0x1b0cc41/0x1bc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:48.452795+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:49.452971+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:50.453154+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:51.453301+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.175664902s of 18.268918991s, submitted: 18
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 127483904 unmapped: 23838720 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1430746 data_alloc: 234881024 data_used: 14032896
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8b00000/0x0/0x4ffc00000, data 0x269fc41/0x2756000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:52.453487+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132423680 unmapped: 18898944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:53.453669+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132751360 unmapped: 18571264 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:54.453846+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f78b5000/0x0/0x4ffc00000, data 0x2748c41/0x27ff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132784128 unmapped: 18538496 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:55.454044+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132784128 unmapped: 18538496 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f78b5000/0x0/0x4ffc00000, data 0x2748c41/0x27ff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:56.454253+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132816896 unmapped: 18505728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1453958 data_alloc: 234881024 data_used: 14815232
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:57.454391+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132825088 unmapped: 18497536 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:58.454516+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132825088 unmapped: 18497536 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:25:59.454725+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132947968 unmapped: 18374656 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:00.454834+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132947968 unmapped: 18374656 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:01.454979+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f789c000/0x0/0x4ffc00000, data 0x2769c41/0x2820000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132947968 unmapped: 18374656 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1449662 data_alloc: 234881024 data_used: 14823424
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:02.455183+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132947968 unmapped: 18374656 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:03.455412+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132947968 unmapped: 18374656 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f789c000/0x0/0x4ffc00000, data 0x2769c41/0x2820000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:04.455629+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132947968 unmapped: 18374656 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.859183311s of 13.162115097s, submitted: 116
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:05.455785+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 133955584 unmapped: 17367040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:06.455938+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 133955584 unmapped: 17367040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58400 session 0x563655c534a0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1449854 data_alloc: 234881024 data_used: 14823424
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a96800
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:07.456096+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a96800 session 0x56365861cf00
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:08.456242+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:09.456476+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:10.457666+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:11.457995+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:12.458409+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:13.459042+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:14.459530+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:15.460041+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:16.460586+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:17.461103+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:18.461733+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:19.462478+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:20.462963+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:21.463274+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:22.463654+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:23.463943+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:24.464169+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:25.464678+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:26.465088+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:27.465462+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:28.465812+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:29.466208+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:30.466750+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:31.467100+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:32.467703+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:33.467943+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:34.468210+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:35.468450+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:36.468688+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:37.468983+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:38.469226+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:39.469554+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:40.469804+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:41.470068+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:42.471964+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:43.473364+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:44.475016+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:45.475906+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122568704 unmapped: 28753920 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:46.477261+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122568704 unmapped: 28753920 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:47.478832+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122568704 unmapped: 28753920 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:48.479051+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122568704 unmapped: 28753920 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:49.479293+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122568704 unmapped: 28753920 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:50.480238+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122568704 unmapped: 28753920 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:51.481093+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122576896 unmapped: 28745728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:52.481578+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122576896 unmapped: 28745728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:53.481850+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122576896 unmapped: 28745728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:54.482207+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122576896 unmapped: 28745728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:55.482424+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122576896 unmapped: 28745728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:56.482754+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122576896 unmapped: 28745728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:57.483224+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122576896 unmapped: 28745728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:58.483461+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:26:59.483805+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:00.483999+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:01.484354+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:02.484726+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:03.484952+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:04.485293+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:05.485484+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:06.485657+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:07.485816+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:08.485967+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:09.486123+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:10.486311+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:11.486487+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:12.486786+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:13.486952+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:14.487160+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:15.487309+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:16.487482+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:17.487661+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:18.487849+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:19.488002+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:20.488150+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:21.488284+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:22.488488+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122617856 unmapped: 28704768 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:23.488660+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 29679616 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:24.488824+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 29679616 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:25.488941+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 29679616 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:26.489101+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 29679616 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:27.489239+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 29679616 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:28.489385+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 29679616 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:29.489567+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 29679616 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:30.489658+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 29630464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:31.489799+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: do_command 'config diff' '{prefix=config diff}'
Dec 01 10:37:54 compute-2 ceph-osd[78644]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec 01 10:37:54 compute-2 ceph-osd[78644]: do_command 'config show' '{prefix=config show}'
Dec 01 10:37:54 compute-2 ceph-osd[78644]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec 01 10:37:54 compute-2 ceph-osd[78644]: do_command 'counter dump' '{prefix=counter dump}'
Dec 01 10:37:54 compute-2 ceph-osd[78644]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec 01 10:37:54 compute-2 ceph-osd[78644]: do_command 'counter schema' '{prefix=counter schema}'
Dec 01 10:37:54 compute-2 ceph-osd[78644]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121544704 unmapped: 29777920 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:32.489976+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:33.490651+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121503744 unmapped: 29818880 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: do_command 'log dump' '{prefix=log dump}'
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:34.490782+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Dec 01 10:37:54 compute-2 ceph-osd[78644]: do_command 'perf dump' '{prefix=perf dump}'
Dec 01 10:37:54 compute-2 ceph-osd[78644]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Dec 01 10:37:54 compute-2 ceph-osd[78644]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Dec 01 10:37:54 compute-2 ceph-osd[78644]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Dec 01 10:37:54 compute-2 ceph-osd[78644]: do_command 'perf schema' '{prefix=perf schema}'
Dec 01 10:37:54 compute-2 ceph-osd[78644]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121454592 unmapped: 29868032 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:35.490911+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121454592 unmapped: 29868032 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:36.491073+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121454592 unmapped: 29868032 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:37.491216+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121462784 unmapped: 29859840 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:38.491345+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121470976 unmapped: 29851648 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:39.491469+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121470976 unmapped: 29851648 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:40.491598+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121470976 unmapped: 29851648 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:41.491726+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121470976 unmapped: 29851648 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:42.491916+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121470976 unmapped: 29851648 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:43.492034+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121470976 unmapped: 29851648 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:44.492239+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121470976 unmapped: 29851648 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:45.492372+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 29843456 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:46.492527+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 29843456 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:47.492746+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 29843456 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:48.492890+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 29843456 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:49.493055+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121487360 unmapped: 29835264 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:50.493194+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:51.493387+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121487360 unmapped: 29835264 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:52.493646+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121487360 unmapped: 29835264 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:53.493803+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121487360 unmapped: 29835264 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:54.493933+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121487360 unmapped: 29835264 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:55.494069+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121487360 unmapped: 29835264 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:56.494225+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121487360 unmapped: 29835264 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:57.557649+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121495552 unmapped: 29827072 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:58.557914+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121495552 unmapped: 29827072 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:27:59.558044+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121495552 unmapped: 29827072 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:00.558209+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121495552 unmapped: 29827072 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:01.558352+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121495552 unmapped: 29827072 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:02.558521+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121495552 unmapped: 29827072 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:03.558667+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121503744 unmapped: 29818880 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:04.558801+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121503744 unmapped: 29818880 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:05.558955+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121503744 unmapped: 29818880 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:06.559184+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121511936 unmapped: 29810688 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:07.559359+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121511936 unmapped: 29810688 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:08.559530+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121511936 unmapped: 29810688 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:09.559739+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121511936 unmapped: 29810688 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:10.559952+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121511936 unmapped: 29810688 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:11.560145+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121511936 unmapped: 29810688 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:12.560469+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121511936 unmapped: 29810688 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:13.560624+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121520128 unmapped: 29802496 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:14.560895+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121520128 unmapped: 29802496 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:15.561098+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121520128 unmapped: 29802496 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:16.561340+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121520128 unmapped: 29802496 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:17.561509+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121520128 unmapped: 29802496 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:18.561682+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121528320 unmapped: 29794304 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:19.561849+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121528320 unmapped: 29794304 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:20.562015+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121528320 unmapped: 29794304 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:21.562147+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121528320 unmapped: 29794304 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:22.562367+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121528320 unmapped: 29794304 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:23.562494+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121528320 unmapped: 29794304 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:24.562669+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121528320 unmapped: 29794304 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:25.562806+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121528320 unmapped: 29794304 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:26.563016+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121536512 unmapped: 29786112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:27.563230+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121536512 unmapped: 29786112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:28.563359+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119635968 unmapped: 31686656 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:29.563507+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 31678464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:30.563651+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 31678464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:31.563788+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 31678464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:32.564001+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 31678464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:33.564237+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 31678464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:34.564420+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 31678464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:35.564559+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 31678464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:36.564731+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 31678464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:37.564923+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 31678464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:38.565155+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 31678464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:39.565280+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 31678464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:40.565405+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 31678464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:41.565557+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 31678464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:42.565784+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 31678464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:43.566000+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 31678464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:44.566169+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119652352 unmapped: 31670272 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:45.566388+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119660544 unmapped: 31662080 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:46.566660+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119660544 unmapped: 31662080 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:47.566830+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119660544 unmapped: 31662080 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:48.566963+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119660544 unmapped: 31662080 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:49.567173+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119668736 unmapped: 31653888 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:50.567330+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119668736 unmapped: 31653888 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:51.567569+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119668736 unmapped: 31653888 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:52.567786+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119668736 unmapped: 31653888 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:53.568201+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119668736 unmapped: 31653888 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:54.568405+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119668736 unmapped: 31653888 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:55.568820+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119668736 unmapped: 31653888 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:56.570692+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119668736 unmapped: 31653888 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:57.570855+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119676928 unmapped: 31645696 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:58.571188+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119685120 unmapped: 31637504 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:28:59.571516+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119685120 unmapped: 31637504 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:00.571745+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119685120 unmapped: 31637504 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:01.571935+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119685120 unmapped: 31637504 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:02.572163+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119685120 unmapped: 31637504 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:03.572710+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119685120 unmapped: 31637504 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:04.572896+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119685120 unmapped: 31637504 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:05.573418+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119685120 unmapped: 31637504 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:06.573666+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119685120 unmapped: 31637504 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:07.573795+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119685120 unmapped: 31637504 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:08.574085+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119685120 unmapped: 31637504 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:09.574297+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119701504 unmapped: 31621120 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:10.574681+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119701504 unmapped: 31621120 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:11.574882+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119701504 unmapped: 31621120 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:12.575146+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119701504 unmapped: 31621120 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:13.575374+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119701504 unmapped: 31621120 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:14.575517+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119701504 unmapped: 31621120 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:15.575710+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119701504 unmapped: 31621120 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:16.576023+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119701504 unmapped: 31621120 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:17.576179+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119701504 unmapped: 31621120 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:18.576447+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119701504 unmapped: 31621120 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:19.576748+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119701504 unmapped: 31621120 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:20.577052+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119709696 unmapped: 31612928 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:21.577312+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119709696 unmapped: 31612928 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:22.577721+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 31604736 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:23.577845+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 31604736 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:24.578029+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 31604736 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:25.578217+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 31604736 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:26.578431+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 31604736 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 2400.2 total, 600.0 interval
                                           Cumulative writes: 10K writes, 41K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.02 MB/s
                                           Cumulative WAL: 10K writes, 2921 syncs, 3.62 writes per sync, written: 0.04 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 2319 writes, 8267 keys, 2319 commit groups, 1.0 writes per commit group, ingest: 9.21 MB, 0.02 MB/s
                                           Interval WAL: 2319 writes, 967 syncs, 2.40 writes per sync, written: 0.01 GB, 0.02 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:27.578698+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 31604736 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:28.578945+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 31604736 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:29.579194+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 31604736 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:30.579307+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 31604736 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:31.579503+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 31604736 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:32.579959+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 31604736 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:33.580156+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119726080 unmapped: 31596544 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:34.580349+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119726080 unmapped: 31596544 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:35.580467+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119726080 unmapped: 31596544 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:36.580656+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119726080 unmapped: 31596544 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:37.580836+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119726080 unmapped: 31596544 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:38.581047+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119726080 unmapped: 31596544 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:39.581201+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119726080 unmapped: 31596544 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:40.581368+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119726080 unmapped: 31596544 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:41.581487+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119726080 unmapped: 31596544 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:42.581745+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119726080 unmapped: 31596544 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:43.581946+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119726080 unmapped: 31596544 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:44.582286+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119726080 unmapped: 31596544 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:45.582443+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119726080 unmapped: 31596544 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:46.582660+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119734272 unmapped: 31588352 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:47.582793+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119734272 unmapped: 31588352 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:48.582959+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119734272 unmapped: 31588352 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:49.583138+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119734272 unmapped: 31588352 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:50.583339+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119742464 unmapped: 31580160 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:51.583496+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119742464 unmapped: 31580160 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:52.583665+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119742464 unmapped: 31580160 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:53.583802+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119742464 unmapped: 31580160 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:54.583922+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119742464 unmapped: 31580160 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:55.584090+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119742464 unmapped: 31580160 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:56.584263+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119742464 unmapped: 31580160 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:57.584382+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119742464 unmapped: 31580160 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:58.584535+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119742464 unmapped: 31580160 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:29:59.584704+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119742464 unmapped: 31580160 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:00.584867+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119742464 unmapped: 31580160 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:01.585018+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119742464 unmapped: 31580160 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:02.586548+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119750656 unmapped: 31571968 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:03.586719+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119750656 unmapped: 31571968 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:04.586885+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119750656 unmapped: 31571968 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:05.587011+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119750656 unmapped: 31571968 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:06.587194+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119750656 unmapped: 31571968 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:07.587348+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119750656 unmapped: 31571968 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:08.587466+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119750656 unmapped: 31571968 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:09.587551+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119750656 unmapped: 31571968 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:10.587680+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119758848 unmapped: 31563776 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:11.587829+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119758848 unmapped: 31563776 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:12.588012+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119758848 unmapped: 31563776 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:13.588175+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119767040 unmapped: 31555584 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:14.588334+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119767040 unmapped: 31555584 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:15.588469+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119767040 unmapped: 31555584 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:16.588609+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119767040 unmapped: 31555584 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:17.588760+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119767040 unmapped: 31555584 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:18.588908+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119767040 unmapped: 31555584 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:19.589036+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119767040 unmapped: 31555584 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:20.589198+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119767040 unmapped: 31555584 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:21.589632+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119775232 unmapped: 31547392 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:22.589799+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119775232 unmapped: 31547392 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:23.589959+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119775232 unmapped: 31547392 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:24.590101+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119775232 unmapped: 31547392 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:25.590278+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119775232 unmapped: 31547392 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:26.590414+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119783424 unmapped: 31539200 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:27.590639+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 263.094940186s of 263.126586914s, submitted: 17
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119783424 unmapped: 31539200 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:28.590774+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119832576 unmapped: 31490048 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:29.590925+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119914496 unmapped: 31408128 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:30.591064+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119971840 unmapped: 31350784 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:31.591193+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119988224 unmapped: 31334400 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:32.591354+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 30244864 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:33.591539+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 30244864 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:34.591779+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121085952 unmapped: 30236672 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:35.591931+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121085952 unmapped: 30236672 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:36.592116+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121085952 unmapped: 30236672 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:37.592265+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121085952 unmapped: 30236672 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:38.592421+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121085952 unmapped: 30236672 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:39.592644+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121085952 unmapped: 30236672 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:40.592834+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121085952 unmapped: 30236672 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:41.593004+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121085952 unmapped: 30236672 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:42.593179+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 30228480 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:43.593342+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 30228480 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:44.593486+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 30228480 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:45.593642+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 30220288 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:46.593784+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 30220288 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:47.593909+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 30220288 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:48.594057+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 30220288 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:49.594209+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 30220288 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:50.594371+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121110528 unmapped: 30212096 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:51.594498+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121110528 unmapped: 30212096 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:52.594702+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121110528 unmapped: 30212096 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:53.594892+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121118720 unmapped: 30203904 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:54.595043+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121118720 unmapped: 30203904 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:55.595168+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121118720 unmapped: 30203904 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:56.595335+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121118720 unmapped: 30203904 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:57.595526+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121118720 unmapped: 30203904 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:58.595695+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121118720 unmapped: 30203904 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:30:59.595846+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121118720 unmapped: 30203904 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:00.596033+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121118720 unmapped: 30203904 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:01.596404+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:02.596666+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:03.597503+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:04.597694+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:05.598373+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:06.598552+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:07.599098+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:08.599254+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:09.599512+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:10.599679+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:11.599998+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:12.600182+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:13.600516+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:14.600671+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:15.600938+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:16.601167+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:17.601419+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:18.601567+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:19.601809+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:20.601961+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:21.602129+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:22.602318+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:23.602505+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:24.602673+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:25.602889+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:26.603046+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:27.603185+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:28.603366+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:29.603538+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:30.603704+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:31.603906+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:32.604100+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:33.604242+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:34.604408+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:35.604578+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:36.604756+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:37.604890+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:38.605033+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:39.605222+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:40.605371+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:41.605522+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:42.605745+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:43.605852+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:44.605960+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:45.606104+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:46.606208+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:47.606324+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:48.606446+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:49.606583+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:50.606738+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:51.606887+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:52.607063+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:53.607204+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:54.607371+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:55.607483+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:56.607649+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:57.607765+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:58.607926+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:31:59.608082+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:00.608208+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:01.608358+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:02.608534+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:03.608704+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:04.608909+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:05.609118+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:06.609330+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:07.609468+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:08.609655+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:09.609866+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:10.610031+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:11.610200+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:12.610340+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:13.610511+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:14.610670+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121135104 unmapped: 30187520 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:15.610849+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121135104 unmapped: 30187520 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:16.610988+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121135104 unmapped: 30187520 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:17.611154+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121135104 unmapped: 30187520 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:18.611236+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121135104 unmapped: 30187520 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:19.611435+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121135104 unmapped: 30187520 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:20.611656+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121135104 unmapped: 30187520 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:21.611815+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121135104 unmapped: 30187520 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:22.612024+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121135104 unmapped: 30187520 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:23.612194+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121135104 unmapped: 30187520 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:24.612321+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121135104 unmapped: 30187520 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:25.612461+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121135104 unmapped: 30187520 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:26.612665+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 30179328 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:27.612825+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 30179328 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:28.613020+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 30179328 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:29.613186+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 30179328 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:30.613373+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 30179328 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:31.613482+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 30179328 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:32.613742+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 30179328 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:33.613887+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 30179328 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:34.614013+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:35.614180+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:36.614438+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:37.615452+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:38.615641+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:39.616670+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:40.616843+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:41.617741+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:42.617954+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:43.618573+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:44.618850+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:45.619046+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:46.619218+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:47.619563+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:48.619740+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:49.620039+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:50.620178+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:51.620464+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:52.620678+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:53.620859+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:54.621017+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:55.621250+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:56.621393+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:57.621536+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:58.621749+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:32:59.622008+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:00.622158+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:01.622431+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:02.622637+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:03.622889+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:04.623033+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:05.623254+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:06.623400+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:07.623586+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:08.623800+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:09.623920+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:10.624057+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:11.624178+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:12.624379+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:13.624532+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:14.624680+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:15.624840+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:16.624965+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:17.625122+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:18.625259+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:19.625429+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:20.625536+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:21.625668+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:22.625827+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:23.625939+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:24.626110+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:25.626248+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:26.626390+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:27.626532+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:28.626707+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:29.626942+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:30.627110+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:31.627245+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:32.627425+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:33.627562+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:34.627697+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:35.627836+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:36.627982+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:37.628133+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:38.628279+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121184256 unmapped: 30138368 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:39.628423+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121184256 unmapped: 30138368 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:40.628605+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121184256 unmapped: 30138368 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:41.631390+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121184256 unmapped: 30138368 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:42.632070+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121184256 unmapped: 30138368 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:43.635415+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121184256 unmapped: 30138368 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:44.635557+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121184256 unmapped: 30138368 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:45.636083+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121184256 unmapped: 30138368 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:46.636243+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121184256 unmapped: 30138368 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:47.637334+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121184256 unmapped: 30138368 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:48.637524+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121184256 unmapped: 30138368 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:49.638840+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121192448 unmapped: 30130176 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:50.639268+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121192448 unmapped: 30130176 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:51.641676+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121192448 unmapped: 30130176 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:52.642189+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121192448 unmapped: 30130176 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:53.642633+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121192448 unmapped: 30130176 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:54.642943+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121192448 unmapped: 30130176 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:55.643197+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121192448 unmapped: 30130176 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:56.643344+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121192448 unmapped: 30130176 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:57.643640+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:58.643964+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:33:59.644488+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:00.644811+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:01.645104+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:02.645395+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:03.645587+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:04.645783+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:05.645942+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:06.646112+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:07.646287+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:08.646465+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:09.646669+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:10.646841+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:11.647016+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:12.647252+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:13.647507+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121208832 unmapped: 30113792 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:14.647779+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121208832 unmapped: 30113792 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:15.647940+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121208832 unmapped: 30113792 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:16.648119+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121208832 unmapped: 30113792 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:17.648450+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121208832 unmapped: 30113792 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:18.648752+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121208832 unmapped: 30113792 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:19.649055+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121208832 unmapped: 30113792 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:20.649326+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121208832 unmapped: 30113792 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:21.649488+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:22.649717+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:23.649903+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:24.650165+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:25.650405+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:26.650546+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets getting new tickets!
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:27.650881+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _finish_auth 0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:27.651753+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:28.651088+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:29.651333+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:30.651552+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:31.651735+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:32.652012+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:33.652217+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:34.652367+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:35.652580+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:36.652809+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:37.653026+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:38.653172+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:39.653429+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:40.653628+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:41.653764+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121225216 unmapped: 30097408 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:42.654021+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121225216 unmapped: 30097408 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:43.654206+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121225216 unmapped: 30097408 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:44.654395+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121225216 unmapped: 30097408 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:45.654631+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121225216 unmapped: 30097408 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:46.654788+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121225216 unmapped: 30097408 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:47.654973+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121225216 unmapped: 30097408 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:48.655176+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121225216 unmapped: 30097408 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:49.655354+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121225216 unmapped: 30097408 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:50.655558+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121225216 unmapped: 30097408 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:51.655930+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121225216 unmapped: 30097408 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:52.656121+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121225216 unmapped: 30097408 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:53.656269+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121225216 unmapped: 30097408 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:54.656453+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121225216 unmapped: 30097408 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:55.656666+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121225216 unmapped: 30097408 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:56.656807+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121233408 unmapped: 30089216 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:57.657048+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121233408 unmapped: 30089216 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:58.657205+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121233408 unmapped: 30089216 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:34:59.657345+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121233408 unmapped: 30089216 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:00.657528+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121233408 unmapped: 30089216 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:01.657724+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121233408 unmapped: 30089216 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:02.657922+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121233408 unmapped: 30089216 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:03.658120+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121233408 unmapped: 30089216 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:04.658302+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121233408 unmapped: 30089216 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:05.658455+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121241600 unmapped: 30081024 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:06.658649+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121241600 unmapped: 30081024 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:07.658830+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121241600 unmapped: 30081024 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:08.659007+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:09.659199+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:10.659397+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:11.659635+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:12.659841+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:13.660017+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:14.660193+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:15.660383+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:16.660577+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:17.660805+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:18.661038+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:19.661263+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:20.661467+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:21.661663+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:22.661854+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:23.662062+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:24.662230+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:25.662364+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121257984 unmapped: 30064640 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:26.662549+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121257984 unmapped: 30064640 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:27.662809+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121257984 unmapped: 30064640 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:28.662997+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121257984 unmapped: 30064640 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:29.663154+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121257984 unmapped: 30064640 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:30.663284+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121257984 unmapped: 30064640 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658d10000 session 0x56365579b680
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: handle_auth_request added challenge on 0x563655a97000
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:31.663474+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121257984 unmapped: 30064640 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:32.663659+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121257984 unmapped: 30064640 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:33.663840+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121257984 unmapped: 30064640 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:34.664028+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121257984 unmapped: 30064640 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:35.664196+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121257984 unmapped: 30064640 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:36.664371+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121257984 unmapped: 30064640 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:37.664661+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121257984 unmapped: 30064640 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:38.664798+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121266176 unmapped: 30056448 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:39.664999+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121266176 unmapped: 30056448 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:40.665134+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121266176 unmapped: 30056448 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:41.665309+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121274368 unmapped: 30048256 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:42.665508+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121274368 unmapped: 30048256 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:43.665694+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121274368 unmapped: 30048256 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:44.665830+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121274368 unmapped: 30048256 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:45.665961+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121274368 unmapped: 30048256 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:46.666136+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121274368 unmapped: 30048256 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:47.666344+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121274368 unmapped: 30048256 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:48.666501+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121274368 unmapped: 30048256 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:49.666678+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121274368 unmapped: 30048256 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:50.666826+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121274368 unmapped: 30048256 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:51.666963+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121282560 unmapped: 30040064 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:52.667149+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121282560 unmapped: 30040064 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:53.667316+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121282560 unmapped: 30040064 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:54.667444+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121290752 unmapped: 30031872 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:55.667612+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121290752 unmapped: 30031872 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:56.667758+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121290752 unmapped: 30031872 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:57.667916+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121290752 unmapped: 30031872 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:58.668077+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121290752 unmapped: 30031872 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:35:59.668199+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121290752 unmapped: 30031872 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:00.668330+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121290752 unmapped: 30031872 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:01.668478+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121290752 unmapped: 30031872 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:02.668656+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121290752 unmapped: 30031872 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:03.668835+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121290752 unmapped: 30031872 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:04.668941+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121290752 unmapped: 30031872 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:05.669068+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121298944 unmapped: 30023680 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:06.669201+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121298944 unmapped: 30023680 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:07.669360+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121298944 unmapped: 30023680 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:08.669472+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121298944 unmapped: 30023680 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:09.669621+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121298944 unmapped: 30023680 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:10.669747+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121307136 unmapped: 30015488 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:11.669890+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121307136 unmapped: 30015488 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:12.670121+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121307136 unmapped: 30015488 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:13.670278+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121307136 unmapped: 30015488 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:14.670449+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121307136 unmapped: 30015488 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:15.670583+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121307136 unmapped: 30015488 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:16.670739+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121307136 unmapped: 30015488 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:17.670894+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121307136 unmapped: 30015488 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:18.671049+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121307136 unmapped: 30015488 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:19.671231+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121307136 unmapped: 30015488 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:20.671353+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121307136 unmapped: 30015488 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:21.671520+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121307136 unmapped: 30015488 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:22.671721+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121307136 unmapped: 30015488 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:23.671852+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121307136 unmapped: 30015488 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:24.671985+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121315328 unmapped: 30007296 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:25.672135+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121315328 unmapped: 30007296 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:26.672270+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121315328 unmapped: 30007296 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:27.672459+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121315328 unmapped: 30007296 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:28.672658+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121315328 unmapped: 30007296 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:29.672816+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121315328 unmapped: 30007296 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:30.672980+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121315328 unmapped: 30007296 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:31.673099+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121323520 unmapped: 29999104 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:32.673462+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121323520 unmapped: 29999104 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:33.673630+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121323520 unmapped: 29999104 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:34.673788+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121323520 unmapped: 29999104 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:35.673920+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121323520 unmapped: 29999104 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:36.674062+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121323520 unmapped: 29999104 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:37.674256+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121331712 unmapped: 29990912 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:38.674428+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121331712 unmapped: 29990912 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:39.674584+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121331712 unmapped: 29990912 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:40.674768+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121331712 unmapped: 29990912 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:41.674928+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121331712 unmapped: 29990912 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:42.675128+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121331712 unmapped: 29990912 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:43.675291+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121331712 unmapped: 29990912 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:44.675419+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121331712 unmapped: 29990912 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:45.675701+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121331712 unmapped: 29990912 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:46.675857+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121331712 unmapped: 29990912 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:47.676077+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121331712 unmapped: 29990912 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:48.676263+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121331712 unmapped: 29990912 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:49.676404+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121331712 unmapped: 29990912 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:50.676558+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121339904 unmapped: 29982720 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:51.676711+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121339904 unmapped: 29982720 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:52.676929+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121339904 unmapped: 29982720 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:53.677285+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121339904 unmapped: 29982720 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:54.677676+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121339904 unmapped: 29982720 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:55.677844+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121339904 unmapped: 29982720 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:56.678199+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121339904 unmapped: 29982720 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:57.678551+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121339904 unmapped: 29982720 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:58.678866+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121061376 unmapped: 30261248 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:36:59.679210+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121061376 unmapped: 30261248 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:37:00.679426+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121061376 unmapped: 30261248 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:37:01.679733+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:37:02.679911+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:37:03.680198+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:37:04.680453+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:37:05.680647+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:37:06.680832+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:37:07.681023+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:37:08.681281+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:37:09.681504+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:37:10.681654+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:37:11.681853+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:37:12.682020+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:37:13.682253+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:37:14.682447+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:37:15.682629+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:37:16.682830+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:37:17.682978+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:37:18.683123+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:37:19.683256+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 01 10:37:54 compute-2 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 01 10:37:54 compute-2 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:37:20.683390+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:37:21.683519+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: do_command 'config diff' '{prefix=config diff}'
Dec 01 10:37:54 compute-2 ceph-osd[78644]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec 01 10:37:54 compute-2 ceph-osd[78644]: do_command 'config show' '{prefix=config show}'
Dec 01 10:37:54 compute-2 ceph-osd[78644]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec 01 10:37:54 compute-2 ceph-osd[78644]: do_command 'counter dump' '{prefix=counter dump}'
Dec 01 10:37:54 compute-2 ceph-osd[78644]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec 01 10:37:54 compute-2 ceph-osd[78644]: do_command 'counter schema' '{prefix=counter schema}'
Dec 01 10:37:54 compute-2 ceph-osd[78644]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:37:22.683657+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:37:23.683788+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121331712 unmapped: 29990912 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec 01 10:37:54 compute-2 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: tick
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_tickets
Dec 01 10:37:54 compute-2 ceph-osd[78644]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-01T10:37:24.683925+0000)
Dec 01 10:37:54 compute-2 ceph-osd[78644]: do_command 'log dump' '{prefix=log dump}'
Dec 01 10:37:55 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec 01 10:37:55 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2037797509' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 01 10:37:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:55 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 01 10:37:55 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1242799633' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 10:37:55 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:55 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:55 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:55 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec 01 10:37:55 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:55.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec 01 10:37:55 compute-2 ceph-mon[76053]: pgmap v1426: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:37:55 compute-2 ceph-mon[76053]: from='client.27533 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:55 compute-2 ceph-mon[76053]: from='client.17682 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:55 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3150585897' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 01 10:37:55 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2037797509' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 01 10:37:55 compute-2 ceph-mon[76053]: from='client.27560 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:55 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2595146075' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 01 10:37:55 compute-2 ceph-mon[76053]: from='client.27034 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:55 compute-2 ceph-mon[76053]: from='client.17697 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:55 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/4131228724' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 01 10:37:55 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1242799633' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 10:37:55 compute-2 ceph-mon[76053]: from='client.27572 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:55 compute-2 ceph-mon[76053]: from='client.27052 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:55 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/694297750' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 01 10:37:55 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/889218059' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 01 10:37:55 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec 01 10:37:55 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4285385933' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 01 10:37:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:56 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:56 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:56 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:56.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:56 compute-2 crontab[254856]: (root) LIST (root)
Dec 01 10:37:56 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Dec 01 10:37:56 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/301693546' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Dec 01 10:37:56 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:37:56 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:56 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:56 compute-2 ceph-mon[76053]: from='client.17712 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:56 compute-2 ceph-mon[76053]: from='client.27079 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:56 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/4285385933' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 01 10:37:56 compute-2 ceph-mon[76053]: from='client.27599 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:56 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3319515747' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 10:37:56 compute-2 ceph-mon[76053]: from='client.17727 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:56 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/618799525' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 01 10:37:56 compute-2 ceph-mon[76053]: from='client.27088 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:56 compute-2 ceph-mon[76053]: from='client.27611 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:56 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/301693546' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Dec 01 10:37:56 compute-2 ceph-mon[76053]: from='client.17742 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:56 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/30622397' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 01 10:37:56 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1579245954' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 01 10:37:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:57 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Dec 01 10:37:57 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2416539875' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Dec 01 10:37:57 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:57 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:57 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:57 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:57 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:57.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:57 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Dec 01 10:37:57 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3839600687' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec 01 10:37:57 compute-2 ceph-mon[76053]: from='client.27103 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:37:57 compute-2 ceph-mon[76053]: pgmap v1427: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:37:57 compute-2 ceph-mon[76053]: from='client.27623 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:57 compute-2 ceph-mon[76053]: from='client.17754 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:37:57 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2642429074' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Dec 01 10:37:57 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2123307550' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Dec 01 10:37:57 compute-2 ceph-mon[76053]: from='client.27115 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:37:57 compute-2 ceph-mon[76053]: from='client.27638 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:37:57 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2416539875' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Dec 01 10:37:57 compute-2 ceph-mon[76053]: from='client.17760 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:37:57 compute-2 ceph-mon[76053]: from='client.27127 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:37:57 compute-2 ceph-mon[76053]: from='client.27656 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:37:57 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3839600687' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec 01 10:37:58 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Dec 01 10:37:58 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3481455124' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Dec 01 10:37:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:58 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:58 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:58 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:58.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:58 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Dec 01 10:37:58 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1517906889' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Dec 01 10:37:58 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Dec 01 10:37:58 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/563334009' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Dec 01 10:37:58 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:58 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:58 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Dec 01 10:37:58 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1344955448' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Dec 01 10:37:58 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Dec 01 10:37:58 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1745238193' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Dec 01 10:37:59 compute-2 ceph-mon[76053]: from='client.17769 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:37:59 compute-2 ceph-mon[76053]: from='client.27151 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:37:59 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3710131050' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Dec 01 10:37:59 compute-2 ceph-mon[76053]: from='client.27674 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:37:59 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3481455124' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Dec 01 10:37:59 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3662145449' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Dec 01 10:37:59 compute-2 ceph-mon[76053]: from='client.17790 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:37:59 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3890453195' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec 01 10:37:59 compute-2 ceph-mon[76053]: from='client.27689 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:37:59 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1517906889' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Dec 01 10:37:59 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/563334009' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Dec 01 10:37:59 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1113212562' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec 01 10:37:59 compute-2 ceph-mon[76053]: from='client.17802 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:37:59 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3803315090' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Dec 01 10:37:59 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1344955448' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Dec 01 10:37:59 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/280147729' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Dec 01 10:37:59 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1745238193' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Dec 01 10:37:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:59 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Dec 01 10:37:59 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2902631957' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Dec 01 10:37:59 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Dec 01 10:37:59 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1061196610' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec 01 10:37:59 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Dec 01 10:37:59 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2073495775' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Dec 01 10:37:59 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:59 2025: (VI_0) received an invalid passwd!
Dec 01 10:37:59 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:37:59 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:37:59 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:59.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:37:59 compute-2 systemd[1]: Starting Hostname Service...
Dec 01 10:37:59 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Dec 01 10:37:59 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2820665171' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Dec 01 10:37:59 compute-2 systemd[1]: Started Hostname Service.
Dec 01 10:37:59 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Dec 01 10:37:59 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4040223693' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec 01 10:38:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:38:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:38:00 compute-2 ceph-mon[76053]: pgmap v1428: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:38:00 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/681427868' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Dec 01 10:38:00 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1076960667' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Dec 01 10:38:00 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3109756647' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Dec 01 10:38:00 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2902631957' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Dec 01 10:38:00 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/4272453431' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Dec 01 10:38:00 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1061196610' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec 01 10:38:00 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1979824438' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Dec 01 10:38:00 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2407988442' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Dec 01 10:38:00 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2547643522' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Dec 01 10:38:00 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2073495775' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Dec 01 10:38:00 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1766030144' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Dec 01 10:38:00 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2820665171' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Dec 01 10:38:00 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2250972551' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Dec 01 10:38:00 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1712938499' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec 01 10:38:00 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/4040223693' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec 01 10:38:00 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1097426543' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec 01 10:38:00 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/92804374' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Dec 01 10:38:00 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:38:00 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:38:00 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:38:00.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:38:00 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Dec 01 10:38:00 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/360044967' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Dec 01 10:38:00 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Dec 01 10:38:00 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3035638185' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Dec 01 10:38:00 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:38:00 2025: (VI_0) received an invalid passwd!
Dec 01 10:38:00 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Dec 01 10:38:00 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/949704442' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Dec 01 10:38:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:38:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:38:01 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Dec 01 10:38:01 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2642147704' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Dec 01 10:38:01 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3718359981' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Dec 01 10:38:01 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/360044967' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Dec 01 10:38:01 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3677329457' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Dec 01 10:38:01 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3035638185' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Dec 01 10:38:01 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2427246299' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Dec 01 10:38:01 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1684233470' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Dec 01 10:38:01 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1420896866' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec 01 10:38:01 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/949704442' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Dec 01 10:38:01 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3155100187' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Dec 01 10:38:01 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3119880444' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec 01 10:38:01 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/143276294' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Dec 01 10:38:01 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2397045879' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Dec 01 10:38:01 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:38:01 compute-2 podman[255617]: 2025-12-01 10:38:01.406457986 +0000 UTC m=+0.060898312 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 01 10:38:01 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:38:01 2025: (VI_0) received an invalid passwd!
Dec 01 10:38:01 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:38:01 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:38:01 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:38:01.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:38:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:38:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:38:02 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Dec 01 10:38:02 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4272606157' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Dec 01 10:38:02 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:38:02 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:38:02 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:38:02.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:38:02 compute-2 ceph-mon[76053]: pgmap v1429: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:38:02 compute-2 ceph-mon[76053]: from='client.27262 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:38:02 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3753332438' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Dec 01 10:38:02 compute-2 ceph-mon[76053]: from='client.27277 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:38:02 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2642147704' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Dec 01 10:38:02 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3099532193' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Dec 01 10:38:02 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2471595939' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Dec 01 10:38:02 compute-2 ceph-mon[76053]: from='client.27794 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:38:02 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/385475963' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Dec 01 10:38:02 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2036427775' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Dec 01 10:38:02 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/4272606157' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Dec 01 10:38:02 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Dec 01 10:38:02 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1852468334' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Dec 01 10:38:02 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:38:02 2025: (VI_0) received an invalid passwd!
Dec 01 10:38:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:38:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:38:03 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Dec 01 10:38:03 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1543383899' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 01 10:38:03 compute-2 ceph-mon[76053]: from='client.27806 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:38:03 compute-2 ceph-mon[76053]: from='client.17910 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:38:03 compute-2 ceph-mon[76053]: from='client.27812 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:38:03 compute-2 ceph-mon[76053]: from='client.27298 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:38:03 compute-2 ceph-mon[76053]: from='client.27818 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:38:03 compute-2 ceph-mon[76053]: from='client.17925 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:38:03 compute-2 ceph-mon[76053]: from='client.17922 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:38:03 compute-2 ceph-mon[76053]: from='client.27313 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:38:03 compute-2 ceph-mon[76053]: from='client.27836 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:38:03 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1852468334' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Dec 01 10:38:03 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3780487315' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Dec 01 10:38:03 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/1543383899' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 01 10:38:03 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2136226514' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Dec 01 10:38:03 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/360571178' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Dec 01 10:38:03 compute-2 podman[255942]: 2025-12-01 10:38:03.400737849 +0000 UTC m=+0.059361015 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 01 10:38:03 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Dec 01 10:38:03 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2245100185' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Dec 01 10:38:03 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:38:03 2025: (VI_0) received an invalid passwd!
Dec 01 10:38:03 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:38:03 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:38:03 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:38:03.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:38:03 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 01 10:38:03 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 01 10:38:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:38:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:38:04 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:38:04 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:38:04 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:38:04.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:38:04 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 01 10:38:04 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 01 10:38:04 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0)
Dec 01 10:38:04 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/9411584' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Dec 01 10:38:04 compute-2 ceph-mon[76053]: pgmap v1430: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:38:04 compute-2 ceph-mon[76053]: from='client.17931 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:38:04 compute-2 ceph-mon[76053]: from='client.27325 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:38:04 compute-2 ceph-mon[76053]: from='client.27848 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:38:04 compute-2 ceph-mon[76053]: from='client.17943 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:38:04 compute-2 ceph-mon[76053]: from='client.27863 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:38:04 compute-2 ceph-mon[76053]: from='client.27337 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:38:04 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2245100185' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Dec 01 10:38:04 compute-2 ceph-mon[76053]: from='client.17955 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:38:04 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 01 10:38:04 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 01 10:38:04 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1068615254' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 01 10:38:04 compute-2 ceph-mon[76053]: from='client.27349 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:38:04 compute-2 ceph-mon[76053]: from='client.27878 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:38:04 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/995261930' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Dec 01 10:38:04 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 01 10:38:04 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 01 10:38:04 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 01 10:38:04 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 01 10:38:04 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3103096758' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Dec 01 10:38:04 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 01 10:38:04 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 01 10:38:04 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/102627420' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 01 10:38:04 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 01 10:38:04 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 01 10:38:04 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/9411584' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Dec 01 10:38:04 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:38:04 2025: (VI_0) received an invalid passwd!
Dec 01 10:38:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:38:04.730 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 01 10:38:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:38:04.731 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 01 10:38:04 compute-2 ovn_metadata_agent[141944]: 2025-12-01 10:38:04.731 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 01 10:38:04 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 01 10:38:04 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 01 10:38:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:38:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:38:05 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Dec 01 10:38:05 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3995769561' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Dec 01 10:38:05 compute-2 ceph-mon[76053]: from='client.17976 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:38:05 compute-2 ceph-mon[76053]: from='client.27890 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:38:05 compute-2 ceph-mon[76053]: from='client.27379 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:38:05 compute-2 ceph-mon[76053]: from='client.17988 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:38:05 compute-2 ceph-mon[76053]: from='client.27917 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:38:05 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 01 10:38:05 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 01 10:38:05 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2492578862' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Dec 01 10:38:05 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 01 10:38:05 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 01 10:38:05 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2976293796' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Dec 01 10:38:05 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 01 10:38:05 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 01 10:38:05 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 01 10:38:05 compute-2 ceph-mon[76053]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 01 10:38:05 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3995769561' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Dec 01 10:38:05 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:38:05 2025: (VI_0) received an invalid passwd!
Dec 01 10:38:05 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:38:05 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:38:05 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:38:05.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:38:05 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df"} v 0)
Dec 01 10:38:05 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2112055686' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Dec 01 10:38:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:38:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:38:06 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs dump"} v 0)
Dec 01 10:38:06 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3136302660' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Dec 01 10:38:06 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:38:06 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:38:06 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:38:06.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:38:06 compute-2 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 01 10:38:06 compute-2 ceph-mon[76053]: pgmap v1431: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 01 10:38:06 compute-2 ceph-mon[76053]: from='client.18018 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 01 10:38:06 compute-2 ceph-mon[76053]: from='client.27409 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:38:06 compute-2 ceph-mon[76053]: from='client.27947 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:38:06 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1501445425' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Dec 01 10:38:06 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/4168355367' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Dec 01 10:38:06 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2112055686' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Dec 01 10:38:06 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2496165473' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Dec 01 10:38:06 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3136302660' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Dec 01 10:38:06 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/3059862068' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Dec 01 10:38:06 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs ls"} v 0)
Dec 01 10:38:06 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3695918391' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Dec 01 10:38:06 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:38:06 2025: (VI_0) received an invalid passwd!
Dec 01 10:38:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:38:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:38:07 compute-2 nova_compute[230216]: 2025-12-01 10:38:07.202 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:38:07 compute-2 nova_compute[230216]: 2025-12-01 10:38:07.222 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:38:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mds stat"} v 0)
Dec 01 10:38:07 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2777371565' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Dec 01 10:38:07 compute-2 ceph-mon[76053]: from='client.18057 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:38:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/2155323925' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Dec 01 10:38:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3695918391' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Dec 01 10:38:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/1497769564' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Dec 01 10:38:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/1497804764' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Dec 01 10:38:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/556825127' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 01 10:38:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.10:0/556825127' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 01 10:38:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/2234144395' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Dec 01 10:38:07 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/2777371565' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Dec 01 10:38:07 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:38:07 2025: (VI_0) received an invalid passwd!
Dec 01 10:38:07 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:38:07 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 01 10:38:07 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:38:07.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 01 10:38:07 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump"} v 0)
Dec 01 10:38:07 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3803043975' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Dec 01 10:38:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:38:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:38:08 compute-2 nova_compute[230216]: 2025-12-01 10:38:08.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 01 10:38:08 compute-2 nova_compute[230216]: 2025-12-01 10:38:08.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 01 10:38:08 compute-2 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec 01 10:38:08 compute-2 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 01 10:38:08 compute-2 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:38:08.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 01 10:38:08 compute-2 ceph-mon[76053]: pgmap v1432: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 01 10:38:08 compute-2 ceph-mon[76053]: from='client.27454 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:38:08 compute-2 ceph-mon[76053]: from='client.27989 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Dec 01 10:38:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3744287252' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Dec 01 10:38:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.100:0/251663445' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Dec 01 10:38:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.102:0/3803043975' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Dec 01 10:38:08 compute-2 ceph-mon[76053]: from='client.? 192.168.122.101:0/3738989624' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Dec 01 10:38:08 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:38:08 2025: (VI_0) received an invalid passwd!
Dec 01 10:38:08 compute-2 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls"} v 0)
Dec 01 10:38:08 compute-2 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2813532657' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Dec 01 10:38:09 compute-2 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:38:09 2025: (VI_0) received an invalid passwd!
